feat(copilot): Release v0.5.1 - Smarter BYOK, Tool Caching & Refined Docs

@@ -1,32 +1,36 @@
# GitHub Copilot SDK Pipe for OpenWebUI

**Author:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **Version:** 0.3.0 | **Project:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **License:** MIT
**Author:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **Version:** 0.5.1 | **Project:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **License:** MIT

This is an advanced Pipe function for [OpenWebUI](https://github.com/open-webui/open-webui) that allows you to use GitHub Copilot models (such as `gpt-5`, `gpt-5-mini`, `claude-sonnet-4.5`) directly within OpenWebUI. It is built upon the official [GitHub Copilot SDK for Python](https://github.com/github/copilot-sdk), providing a native integration experience.
This is an advanced Pipe function for [OpenWebUI](https://github.com/open-webui/open-webui) that integrates the official [GitHub Copilot SDK](https://github.com/github/copilot-sdk). It enables you to use **GitHub Copilot models** (e.g., `gpt-5.2-codex`, `claude-sonnet-4.5`, `gemini-3-pro`, `gpt-5-mini`) **AND** your own models via **BYOK** (OpenAI, Anthropic) directly within OpenWebUI, providing a unified agentic experience.

> [!IMPORTANT]
> **Active GitHub Copilot Subscription Required**
> This plugin requires a valid GitHub Copilot subscription (Individual, Business, or Enterprise). It will verify your subscription status during authentication.
> [!TIP]
> **No Subscription Required for BYOK**
> If you are using your own API keys (BYOK mode with OpenAI/Anthropic), **you do NOT need a GitHub Copilot subscription.**
> A subscription is only required to access GitHub's official models.

## 🚀 What's New (v0.3.0) - The Power of "Unified Ecosystem"
## 🚀 What's New (v0.5.1) - Major Upgrade

* **🔌 Zero-Config Tool Bridge**: Automatically transforms your existing OpenWebUI Functions (Tools) into Copilot-compatible tools. **Copilot now has total access to your entire WebUI toolset!**
* **🔗 Dynamic MCP Discovery**: Seamlessly connects to MCP servers defined in **Admin Settings -> Connections**. No configuration files required—it just works.
* **⚡ High-Performance Async Engine**: Background CLI updates and optimized event-driven streaming ensure lightning-fast responses without UI lag.
* **🛡️ Robust Interoperability**: Advanced sanitization and dynamic Pydantic model generation ensure smooth integration even with complex third-party tools.
- **🧠 Smarter BYOK Detection**: Improved logic to correctly identify BYOK vs. Official Copilot models, supporting custom models (Characters/Prompts) and fixing multiplier detection (e.g., `(0x)`, `(1x)`).
- **⚡ Performance Boost**: Implemented **Tool Caching** to persist tool definitions across requests, significantly reducing overhead.
- **🧩 Enriched Tool Integration**: Tool descriptions now include source grouping (Built-in/User/Server) and automatic metadata extraction (Title/Description) from Python docstrings.
- **🛡️ Precise Control**: Added support for OpenWebUI's `function_name_filter_list` to filter MCP and OpenAPI functions.
- **🔑 User-Level BYOK**: Fully leverage Copilot SDK with your own Model Providers (OpenAI, Anthropic) with user-level API Key overrides.
- **📝 Better Formatting**: Enforced standard Markdown tables in system prompts to prevent rendering issues with HTML tables.
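
The tool-caching behavior described above can be sketched roughly as follows. This is a minimal illustration of the idea, not the plugin's actual implementation; the cache key scheme and the `load_tools` callable are hypothetical:

```python
import hashlib
import json

# Cache of tool definitions, persisted across requests for the process lifetime.
_TOOL_CACHE: dict[str, list[dict]] = {}

def get_tools_cached(config: dict, load_tools) -> list[dict]:
    """Return tool definitions, running the expensive discovery only once per config.

    `load_tools` stands in for the (costly) discovery of OpenWebUI/MCP tool
    definitions; its result is reused on subsequent requests with the same config.
    """
    # Key the cache on a stable hash of the relevant configuration.
    key = hashlib.sha256(json.dumps(config, sort_keys=True).encode()).hexdigest()
    if key not in _TOOL_CACHE:
        _TOOL_CACHE[key] = load_tools()
    return _TOOL_CACHE[key]
```

Changing any valve that affects tool discovery would produce a new cache key, so stale definitions are not served after reconfiguration.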

## ✨ Key Capabilities

* **🌉 The Ultimate Bridge**: The first and only plugin that creates a seamless bridge between **OpenWebUI Tools** and **GitHub Copilot SDK**.
* **🚀 Official & Native**: Built directly on the official Python SDK, providing the most stable and authentic Copilot experience.
* **🌊 Advanced Streaming (Thought Process)**: Supports full model reasoning/thinking display with typewriter effects.
* **🖼️ Intelligent Multimodal**: Full support for images and attachments, enabling Copilot to "see" your workspace.
* **🛠️ Effortless Setup**: Automatic CLI detection, version enforcement, and dependency management.
* **🔑 Dual-Layer Security**: Supports secure OAuth flow for Chat and standard PAT for extended MCP capabilities.
- **🔑 Flexible Auth & BYOK**: Supports GitHub Copilot subscription (PAT) OR Bring Your Own Key (OpenAI/Anthropic), giving you total control over model access and billing.
- **🌉 The Ultimate Bridge**: The first and only plugin that creates a seamless bridge between **OpenWebUI Tools** and **GitHub Copilot SDK**.
- **🚀 Official & Native**: Built directly on the official Python SDK, providing the most stable and authentic Copilot experience.
- **🌊 Advanced Streaming (Thought Process)**: Supports full model reasoning/thinking display with typewriter effects.
- **🖼️ Intelligent Multimodal**: Full support for images and attachments, enabling Copilot to "see" your workspace.
- **🛠️ Effortless Setup**: Automatic CLI detection, version enforcement, and dependency management.
- **🛡️ Integrated Security**: Supports secure PAT authentication for standard and extended capabilities.

## 📦 Installation & Usage
## Installation & Configuration

### 1. Import Function
### 1) Import Function

1. Open OpenWebUI.
2. Go to **Workspace** -> **Functions**.
@@ -34,7 +38,7 @@ This is an advanced Pipe function for [OpenWebUI](https://github.com/open-webui/
4. Paste the content of `github_copilot_sdk.py` (or `github_copilot_sdk_cn.py` for Chinese) completely.
5. Save.

### 2. Configure Valves (Settings)
### 2) Configure Valves (Settings)

Find "GitHub Copilot" in the function list and click the **⚙️ (Valves)** icon to configure:

@@ -53,9 +57,14 @@ Find "GitHub Copilot" in the function list and click the **⚙️ (Valves)** ico
| **TIMEOUT** | Timeout for each stream chunk (seconds). | `300` |
| **CUSTOM_ENV_VARS** | Custom environment variables (JSON format). | - |
| **REASONING_EFFORT** | Reasoning effort level: low, medium, high. `xhigh` is supported for some models. | `medium` |
| **ENFORCE_FORMATTING** | Add formatting instructions to system prompt for better readability. | `True` |
| **ENABLE_MCP_SERVER** | Enable Direct MCP Client connection (Recommended). | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | Enable OpenWebUI Tools (includes defined and server tools). | `True` |
| **BYOK_ENABLED** | Enable BYOK (Bring Your Own Key) to use custom providers. | `False` |
| **BYOK_TYPE** | BYOK Provider Type: `openai`, `azure`, `anthropic`. | `openai` |
| **BYOK_BASE_URL** | BYOK Base URL (e.g., `https://api.openai.com/v1`). | - |
| **BYOK_API_KEY** | BYOK API Key (Global Setting). | - |
| **BYOK_BEARER_TOKEN** | BYOK Bearer Token (Global, overrides API Key). | - |
| **BYOK_WIRE_API** | BYOK Wire API: `completions`, `responses`. | `completions` |
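
As a point of reference, a BYOK configuration drawn from the table above might look like the following plain settings dict. The values are illustrative placeholders only (the endpoint and key are not real), and the small readiness check is a sketch, not the plugin's actual validation logic:

```python
# Example BYOK valve values (placeholders, not real credentials).
byok_valves = {
    "BYOK_ENABLED": True,
    "BYOK_TYPE": "openai",            # one of: openai, azure, anthropic
    "BYOK_BASE_URL": "https://api.openai.com/v1",
    "BYOK_API_KEY": "sk-placeholder",  # global key; a BYOK_BEARER_TOKEN would take precedence
    "BYOK_WIRE_API": "completions",    # or "responses"
}

def byok_ready(valves: dict) -> bool:
    """BYOK is usable only when enabled and some credential is present."""
    return bool(
        valves.get("BYOK_ENABLED")
        and (valves.get("BYOK_API_KEY") or valves.get("BYOK_BEARER_TOKEN"))
    )
```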

#### User Valves (per-user overrides)

@@ -69,9 +78,13 @@ These optional settings can be set per user (overrides global Valves):
| **SHOW_THINKING** | Show model reasoning/thinking process. | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | Enable OpenWebUI Tools (overrides global). | `True` |
| **ENABLE_MCP_SERVER** | Enable MCP server loading (overrides global). | `True` |
| **ENFORCE_FORMATTING** | Enforce formatting guidelines (overrides global). | `True` |
| **BYOK_API_KEY** | BYOK API Key (User override). | - |
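
The override behavior in the table above amounts to a simple resolution rule: a per-user valve wins when set, otherwise the global valve applies. A minimal sketch of that rule, assuming `None` (or absence) means "unset":

```python
def effective_setting(name: str, user_valves: dict, global_valves: dict):
    """Per-user valves override global ones; fall back when the user left a valve unset."""
    value = user_valves.get(name)
    return value if value is not None else global_valves.get(name)
```

Note that an explicit `False` from the user still counts as set, so a user can disable a feature that is globally enabled.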

### 3. Get Token
## ⭐ Support

If this plugin has been useful, a star on [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) is a big motivation for me. Thank you for your support.

### Get Token

To use GitHub Copilot, you need a GitHub Personal Access Token (PAT) with appropriate permissions.

@@ -81,24 +94,24 @@ To use GitHub Copilot, you need a GitHub Personal Access Token (PAT) with approp
2. Click **Generate new token (fine-grained)**.
3. **Repository access**: Select **Public Repositories** (simplest) or **All repositories**.
4. **Permissions**:
    * If you chose **All repositories**, you must click **Account permissions**.
    * Find **Copilot Requests**, and select **Access**.
    - If you chose **All repositories**, you must click **Account permissions**.
    - Find **Copilot Requests**, and select **Access**.
5. Generate and copy the Token.

## 📋 Dependencies

This Pipe will automatically attempt to install the following dependencies:

* `github-copilot-sdk` (Python package)
* `github-copilot-cli` (Binary file, installed via official script)
- `github-copilot-sdk` (Python package)
- `github-copilot-cli` (Binary file, installed via official script)
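
The auto-install step for the Python package can be sketched as below. This is a simplified illustration of the pattern, not the Pipe's actual code; the real plugin also installs the `github-copilot-cli` binary via its official script, which this sketch omits:

```python
import importlib.util
import subprocess
import sys

def ensure_package(module_name: str, pip_name: str) -> bool:
    """Install `pip_name` via pip only if `module_name` is not already importable.

    Returns True when the module is (or becomes) available after the attempt.
    """
    if importlib.util.find_spec(module_name) is not None:
        return True  # already present; nothing to install
    result = subprocess.run(
        [sys.executable, "-m", "pip", "install", pip_name],
        capture_output=True,
    )
    return result.returncode == 0 and importlib.util.find_spec(module_name) is not None

# Usage sketch (module/distribution names as used by the SDK are assumptions):
# ensure_package("copilot", "github-copilot-sdk")
```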

## Troubleshooting ❓

* **Images and Multimodal Usage**:
  * Ensure `MODEL_ID` is a model that supports multimodal input.
* **Thinking not shown**:
  * Ensure **streaming is enabled** and the selected model supports reasoning output.
- **Images not recognized**:
  - Ensure `MODEL_ID` is a model that supports multimodal input.
- **Thinking not shown**:
  - Ensure **streaming is enabled** and the selected model supports reasoning output.

## 📄 License
## Changelog

MIT
See the full history on GitHub: [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui)

@@ -1,30 +1,35 @@
# GitHub Copilot SDK 官方管道

**作者:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **版本:** 0.3.0 | **项目:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **许可证:** MIT
**作者:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **版本:** 0.5.1 | **项目:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **许可证:** MIT

这是一个用于 [OpenWebUI](https://github.com/open-webui/open-webui) 的高级 Pipe 函数,允许你直接在 OpenWebUI 中使用 GitHub Copilot 模型(如 `gpt-5`, `gpt-5-mini`, `claude-sonnet-4.5`)。它基于官方 [GitHub Copilot SDK for Python](https://github.com/github/copilot-sdk) 构建,提供了原生级的集成体验。
这是一个用于 [OpenWebUI](https://github.com/open-webui/open-webui) 的高级 Pipe 函数,深度集成了 **GitHub Copilot SDK**。它不仅支持 **GitHub Copilot 官方模型**(如 `gpt-5.2-codex`, `claude-sonnet-4.5`, `gemini-3-pro`, `gpt-5-mini`),还支持 **BYOK (自带 Key)** 模式对接自定义服务商(OpenAI, Anthropic),提供统一的 Agent 交互体验。

> [!IMPORTANT]
> **需 GitHub Copilot 订阅**
> 本插件需要有效的 GitHub Copilot 订阅(个人版、商业版或企业版)。插件将在认证阶段验证您的订阅状态。
> [!TIP]
> **使用 BYOK 模式无需订阅**
> 如果您使用自己的 API Key (OpenAI, Anthropic) 运行 BYOK 模式,**则完全不需要** GitHub Copilot 订阅。
> 仅当您希望使用 GitHub 官方提供的模型时,才需要订阅。

## 🚀 最新特性 (v0.3.0) - “统一生态”的力量
## 🚀 最新特性 (v0.5.1) - 重大升级

* **🔌 零配置工具桥接 (Unified Tool Bridge)**: 自动将您现有的 OpenWebUI Functions (工具) 转换为 Copilot 兼容工具。**Copilot 现在可以无缝调用您手头所有的 WebUI 工具!**
* **🔗 动态 MCP 自动发现**: 直接联动 OpenWebUI **管理面板 -> 连接**。无需编写任何配置文件,即插即用,瞬间扩展 Copilot 能力边界。
* **⚡ 高性能异步引擎**: 异步 CLI 更新检查与高度优化的事件驱动流式处理,确保对话毫秒级响应。
* **🛡️ 卓越的兼容性**: 独创的动态 Pydantic 模型生成技术,确保复杂工具参数在 Copilot 端也能得到精准验证。
- **🧠 智能 BYOK 检测**: 优化了 BYOK 与官方 Copilot 模型的识别逻辑,完美支持自定义模型(角色/提示词)及倍率检测(如 `(0x)`, `(1x)`)。
- **⚡ 性能飙升**: 引入 **工具缓存 (Tool Caching)** 机制,在请求间持久化工具定义,显著降低调用开销。
- **🧩 丰富工具集成**: 工具描述现包含来源分组(内置/用户/服务器)及 Docstring 元数据自动解析。
- **🛡️ 精确控制**: 完美兼容 OpenWebUI 全局函数过滤配置 (`function_name_filter_list`),可精准控制暴露给 LLM 的函数。
- **🔑 用户级 BYOK**: 支持在用户层面配置自定义 API Key 对接 AI 供应商(OpenAI, Anthropic)。
- **📝 格式优化**: 系统提示词强制使用标准 Markdown 表格,彻底解决 HTML 表格渲染问题。

## ✨ 核心能力

* **🌉 强大的生态桥接**: 首个且唯一完美打通 **OpenWebUI Tools** 与 **GitHub Copilot SDK** 的插件。
* **🚀 官方原生体验**: 基于官方 Python SDK 构建,提供最稳定、最纯正的 Copilot 交互体验。
* **🌊 深度推理展示**: 完整支持模型思考过程 (Thinking Process) 的流式渲染。
* **🖼️ 智能多模态**: 支持图像识别与附件上传,让 Copilot 拥有视觉能力。
* **🛠️ 极简部署流程**: 自动检测环境、自动下载 CLI、自动管理依赖,全自动化开箱即用。
* **🔑 安全认证体系**: 完美支持 OAuth 授权与 PAT 模式,兼顾便捷与安全性。
- **🔑 灵活鉴权与 BYOK**: 支持 GitHub Copilot 订阅 (PAT) 或自带 Key (OpenAI/Anthropic),完全掌控模型访问与计费。
- **🌉 强大的生态桥接**: 首个且唯一完美打通 **OpenWebUI Tools** 与 **GitHub Copilot SDK** 的插件。
- **🚀 官方原生体验**: 基于官方 Python SDK 构建,提供最稳定、最纯正的 Copilot 交互体验。
- **🌊 深度推理展示**: 完整支持模型思考过程 (Thinking Process) 的流式渲染。
- **🖼️ 智能多模态**: 支持图像识别与附件上传,让 Copilot 拥有视觉能力。
- **🛠️ 极简部署流程**: 自动检测环境、自动下载 CLI、自动管理依赖,全自动化开箱即用。
- **🛡️ 纯净安全体系**: 支持标准 PAT 认证,确保数据安全。

## 📦 安装与使用
## 安装与配置

### 1. 导入函数

@@ -53,9 +58,14 @@
| **TIMEOUT** | 每个流式分块超时(秒)。 | `300` |
| **CUSTOM_ENV_VARS** | 自定义环境变量 (JSON 格式)。 | - |
| **REASONING_EFFORT** | 推理强度级别: low, medium, high. `xhigh` 仅部分模型支持。 | `medium` |
| **ENFORCE_FORMATTING** | 在系统提示词中添加格式化指导。 | `True` |
| **ENABLE_MCP_SERVER** | 启用直接 MCP 客户端连接 (建议)。 | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | 启用 OpenWebUI 工具 (包括自定义和服务器工具)。 | `True` |
| **BYOK_ENABLED** | 启用 BYOK (自带 Key) 模式以使用自定义供应商。 | `False` |
| **BYOK_TYPE** | BYOK 供应商类型: `openai`, `azure`, `anthropic`。 | `openai` |
| **BYOK_BASE_URL** | BYOK 基础 URL (如 `https://api.openai.com/v1`)。 | - |
| **BYOK_API_KEY** | BYOK API Key (全局设置)。 | - |
| **BYOK_BEARER_TOKEN** | BYOK Bearer Token (全局,覆盖 API Key)。 | - |
| **BYOK_WIRE_API** | BYOK 通信协议: `completions`, `responses`。 | `completions` |

#### 用户 Valves(按用户覆盖)

@@ -69,9 +79,12 @@
| **SHOW_THINKING** | 是否显示思考过程。 | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | 启用 OpenWebUI 工具(覆盖全局设置)。 | `True` |
| **ENABLE_MCP_SERVER** | 启用动态 MCP 服务器加载(覆盖全局设置)。 | `True` |
| **ENFORCE_FORMATTING** | 强制启用格式化指导(覆盖全局设置)。 | `True` |

### 3. 获取 Token
## ⭐ 支持

如果这个插件对你有帮助,欢迎到 [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) 点个 Star,这将是我持续改进的动力,感谢支持。

### 获取 Token

要使用 GitHub Copilot,您需要一个具有适当权限的 GitHub 个人访问令牌 (PAT)。

@@ -81,24 +94,24 @@
2. 点击 **Generate new token (fine-grained)**。
3. **Repository access**: 选择 **Public Repositories** (最简单) 或 **All repositories**。
4. **Permissions**:
    * 如果您选择了 **All repositories**,则必须点击 **Account permissions**。
    * 找到 **Copilot Requests**,选择 **Access**。
    - 如果您选择了 **All repositories**,则必须点击 **Account permissions**。
    - 找到 **Copilot Requests**,选择 **Access**。
5. 生成并复制令牌。

## 📋 依赖说明

该 Pipe 会自动尝试安装以下依赖(如果环境中缺失):

* `github-copilot-sdk` (Python 包)
* `github-copilot-cli` (二进制文件,通过官方脚本安装)
- `github-copilot-sdk` (Python 包)
- `github-copilot-cli` (二进制文件,通过官方脚本安装)

## 故障排除 (Troubleshooting) ❓

* **图片及多模态使用说明**:
  * 确保 `MODEL_ID` 是支持多模态的模型。
* **看不到思考过程**:
  * 确认已开启**流式输出**,且所选模型支持推理输出。
- **图片及多模态使用说明**:
  - 确保 `MODEL_ID` 是支持多模态的模型。
- **看不到思考过程**:
  - 确认已开启**流式输出**,且所选模型支持推理输出。

## 📄 许可证
## 更新日志

MIT
完整历史请查看 GitHub 项目: [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui)

@@ -15,7 +15,7 @@ Pipes allow you to:
## Available Pipe Plugins

- [GitHub Copilot SDK](github-copilot-sdk.md) (v0.3.0) - Official GitHub Copilot SDK integration. Features **zero-config OpenWebUI Tool Bridge** and **dynamic MCP discovery**. Supports streaming, multimodal, and infinite sessions.
- [GitHub Copilot SDK](github-copilot-sdk.md) (v0.5.1) - Official GitHub Copilot SDK integration. Features **zero-config OpenWebUI Tool Bridge**, **BYOK** support, and **dynamic MCP discovery**. Supports streaming, multimodal, and infinite sessions.

---

@@ -15,7 +15,7 @@ Pipes 可以用于:
## 可用的 Pipe 插件

- [GitHub Copilot SDK](github-copilot-sdk.zh.md) (v0.3.0) - GitHub Copilot SDK 官方集成。**零配置工具桥接**与**动态 MCP 发现**。支持流式输出、多模态及无限会话。
- [GitHub Copilot SDK](github-copilot-sdk.zh.md) (v0.5.1) - GitHub Copilot SDK 官方集成。**零配置工具桥接**与**BYOK (自带 Key) 支持**。支持流式输出、多模态及无限会话。

---

@@ -1,28 +1,32 @@
# GitHub Copilot SDK Pipe for OpenWebUI

**Author:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **Version:** 0.3.0 | **Project:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **License:** MIT
**Author:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **Version:** 0.5.1 | **Project:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **License:** MIT

This is an advanced Pipe function for [OpenWebUI](https://github.com/open-webui/open-webui) that allows you to use GitHub Copilot models (such as `gpt-5`, `gpt-5-mini`, `claude-sonnet-4.5`) directly within OpenWebUI. It is built upon the official [GitHub Copilot SDK for Python](https://github.com/github/copilot-sdk), providing a native integration experience.
This is an advanced Pipe function for [OpenWebUI](https://github.com/open-webui/open-webui) that integrates the official [GitHub Copilot SDK](https://github.com/github/copilot-sdk). It enables you to use **GitHub Copilot models** (e.g., `gpt-5.2-codex`, `claude-sonnet-4.5`, `gemini-3-pro`, `gpt-5-mini`) **AND** your own models via **BYOK** (OpenAI, Anthropic) directly within OpenWebUI, providing a unified agentic experience.

> [!IMPORTANT]
> **Active GitHub Copilot Subscription Required**
> This plugin requires a valid GitHub Copilot subscription (Individual, Business, or Enterprise). It will verify your subscription status during authentication.
> [!TIP]
> **No Subscription Required for BYOK**
> If you are using your own API keys (BYOK mode with OpenAI/Anthropic), **you do NOT need a GitHub Copilot subscription.**
> A subscription is only required to access GitHub's official models.

## 🚀 What's New (v0.3.0) - The Power of "Unified Ecosystem"
## 🚀 What's New (v0.5.1) - Major Upgrade

* **🔌 Zero-Config Tool Bridge**: Automatically transforms your existing OpenWebUI Functions (Tools) into Copilot-compatible tools. **Copilot now has total access to your entire WebUI toolset!**
* **🔗 Dynamic MCP Discovery**: Seamlessly connects to MCP servers defined in **Admin Settings -> Connections**. No configuration files required—it just works.
* **⚡ High-Performance Async Engine**: Background CLI updates and optimized event-driven streaming ensure lightning-fast responses without UI lag.
* **🛡️ Robust Interoperability**: Advanced sanitization and dynamic Pydantic model generation ensure smooth integration even with complex third-party tools.
- **🧠 Smarter BYOK Detection**: Improved logic to correctly identify BYOK vs. Official Copilot models, supporting custom models (Characters/Prompts) and fixing multiplier detection (e.g., `(0x)`, `(1x)`).
- **⚡ Performance Boost**: Implemented **Tool Caching** to persist tool definitions across requests, significantly reducing overhead.
- **🧩 Enriched Tool Integration**: Tool descriptions now include source grouping (Built-in/User/Server) and automatic metadata extraction (Title/Description) from Python docstrings.
- **🛡️ Precise Control**: Added support for OpenWebUI's `function_name_filter_list` to filter MCP and OpenAPI functions.
- **🔑 User-Level BYOK**: Fully leverage Copilot SDK with your own Model Providers (OpenAI, Anthropic) with user-level API Key overrides.
- **📝 Better Formatting**: Enforced standard Markdown tables in system prompts to prevent rendering issues with HTML tables.

## ✨ Key Capabilities

* **🌉 The Ultimate Bridge**: The first and only plugin that creates a seamless bridge between **OpenWebUI Tools** and **GitHub Copilot SDK**.
* **🚀 Official & Native**: Built directly on the official Python SDK, providing the most stable and authentic Copilot experience.
* **🌊 Advanced Streaming (Thought Process)**: Supports full model reasoning/thinking display with typewriter effects.
* **🖼️ Intelligent Multimodal**: Full support for images and attachments, enabling Copilot to "see" your workspace.
* **🛠️ Effortless Setup**: Automatic CLI detection, version enforcement, and dependency management.
* **🔑 Dual-Layer Security**: Supports secure OAuth flow for Chat and standard PAT for extended MCP capabilities.
- **🔑 Flexible Auth & BYOK**: Supports GitHub Copilot subscription (PAT) OR Bring Your Own Key (OpenAI/Anthropic), giving you total control over model access and billing.
- **🌉 The Ultimate Bridge**: The first and only plugin that creates a seamless bridge between **OpenWebUI Tools** and **GitHub Copilot SDK**.
- **🚀 Official & Native**: Built directly on the official Python SDK, providing the most stable and authentic Copilot experience.
- **🌊 Advanced Streaming (Thought Process)**: Supports full model reasoning/thinking display with typewriter effects.
- **🖼️ Intelligent Multimodal**: Full support for images and attachments, enabling Copilot to "see" your workspace.
- **🛠️ Effortless Setup**: Automatic CLI detection, version enforcement, and dependency management.
- **🛡️ Integrated Security**: Supports secure PAT authentication for standard and extended capabilities.

## Installation & Configuration

@@ -53,9 +57,15 @@ Find "GitHub Copilot" in the function list and click the **⚙️ (Valves)** ico
| **TIMEOUT** | Timeout for each stream chunk (seconds). | `300` |
| **CUSTOM_ENV_VARS** | Custom environment variables (JSON format). | - |
| **REASONING_EFFORT** | Reasoning effort level: low, medium, high. `xhigh` is supported for some models. | `medium` |
| **ENFORCE_FORMATTING** | Add formatting instructions to system prompt for better readability. | `True` |
| **ENABLE_MCP_SERVER** | Enable Direct MCP Client connection (Recommended). | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | Enable OpenWebUI Tools (includes defined and server tools). | `True` |
| **BYOK_ENABLED** | Enable BYOK (Bring Your Own Key) to use custom providers. | `False` |
| **BYOK_TYPE** | BYOK Provider Type: `openai`, `azure`, `anthropic`. | `openai` |
| **BYOK_BASE_URL** | BYOK Base URL (e.g., `https://api.openai.com/v1`). | - |
| **BYOK_API_KEY** | BYOK API Key (Global Setting). | - |
| **BYOK_BEARER_TOKEN** | BYOK Bearer Token (Global, overrides API Key). | - |
| **BYOK_WIRE_API** | BYOK Wire API: `completions`, `responses`. | `completions` |

#### User Valves (per-user overrides)

@@ -69,7 +79,8 @@ These optional settings can be set per user (overrides global Valves):
| **SHOW_THINKING** | Show model reasoning/thinking process. | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | Enable OpenWebUI Tools (overrides global). | `True` |
| **ENABLE_MCP_SERVER** | Enable MCP server loading (overrides global). | `True` |
| **ENFORCE_FORMATTING** | Enforce formatting guidelines (overrides global). | `True` |
| **BYOK_API_KEY** | BYOK API Key (User override). | - |

## ⭐ Support

@@ -85,23 +96,23 @@ To use GitHub Copilot, you need a GitHub Personal Access Token (PAT) with approp
2. Click **Generate new token (fine-grained)**.
3. **Repository access**: Select **Public Repositories** (simplest) or **All repositories**.
4. **Permissions**:
    * If you chose **All repositories**, you must click **Account permissions**.
    * Find **Copilot Requests**, and select **Access**.
    - If you chose **All repositories**, you must click **Account permissions**.
    - Find **Copilot Requests**, and select **Access**.
5. Generate and copy the Token.

## 📋 Dependencies

This Pipe will automatically attempt to install the following dependencies:

* `github-copilot-sdk` (Python package)
* `github-copilot-cli` (Binary file, installed via official script)
- `github-copilot-sdk` (Python package)
- `github-copilot-cli` (Binary file, installed via official script)

## Troubleshooting ❓

* **Images not recognized**:
  * Ensure `MODEL_ID` is a model that supports multimodal input.
* **Thinking not shown**:
  * Ensure **streaming is enabled** and the selected model supports reasoning output.
- **Images not recognized**:
  - Ensure `MODEL_ID` is a model that supports multimodal input.
- **Thinking not shown**:
  - Ensure **streaming is enabled** and the selected model supports reasoning output.

## Changelog

@@ -1,28 +1,33 @@
# GitHub Copilot SDK 官方管道

**作者:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **版本:** 0.3.0 | **项目:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **许可证:** MIT
**作者:** [Fu-Jie](https://github.com/Fu-Jie/awesome-openwebui) | **版本:** 0.5.1 | **项目:** [Awesome OpenWebUI](https://github.com/Fu-Jie/awesome-openwebui) | **许可证:** MIT

这是一个用于 [OpenWebUI](https://github.com/open-webui/open-webui) 的高级 Pipe 函数,允许你直接在 OpenWebUI 中使用 GitHub Copilot 模型(如 `gpt-5`, `gpt-5-mini`, `claude-sonnet-4.5`)。它基于官方 [GitHub Copilot SDK for Python](https://github.com/github/copilot-sdk) 构建,提供了原生级的集成体验。
这是一个用于 [OpenWebUI](https://github.com/open-webui/open-webui) 的高级 Pipe 函数,深度集成了 **GitHub Copilot SDK**。它不仅支持 **GitHub Copilot 官方模型**(如 `gpt-5.2-codex`, `claude-sonnet-4.5`, `gemini-3-pro`, `gpt-5-mini`),还支持 **BYOK (自带 Key)** 模式对接自定义服务商(OpenAI, Anthropic),提供统一的 Agent 交互体验。

> [!IMPORTANT]
> **需 GitHub Copilot 订阅**
> 本插件需要有效的 GitHub Copilot 订阅(个人版、商业版或企业版)。插件将在认证阶段验证您的订阅状态。
> [!TIP]
> **使用 BYOK 模式无需订阅**
> 如果您使用自己的 API Key (OpenAI, Anthropic) 运行 BYOK 模式,**则完全不需要** GitHub Copilot 订阅。
> 仅当您希望使用 GitHub 官方提供的模型时,才需要订阅。

## 🚀 最新特性 (v0.3.0) - “统一生态”的力量
## 🚀 最新特性 (v0.5.1) - 重大升级

* **🔌 零配置工具桥接 (Unified Tool Bridge)**: 自动将您现有的 OpenWebUI Functions (工具) 转换为 Copilot 兼容工具。**Copilot 现在可以无缝调用您手头所有的 WebUI 工具!**
* **🔗 动态 MCP 自动发现**: 直接联动 OpenWebUI **管理面板 -> 连接**。无需编写任何配置文件,即插即用,瞬间扩展 Copilot 能力边界。
* **⚡ 高性能异步引擎**: 异步 CLI 更新检查与高度优化的事件驱动流式处理,确保对话毫秒级响应。
* **🛡️ 卓越的兼容性**: 独创的动态 Pydantic 模型生成技术,确保复杂工具参数在 Copilot 端也能得到精准验证。
- **🧠 智能 BYOK 检测**: 优化了 BYOK 与官方 Copilot 模型的识别逻辑,完美支持自定义模型(角色/提示词)及倍率检测(如 `(0x)`, `(1x)`)。
- **⚡ 性能飙升**: 引入 **工具缓存 (Tool Caching)** 机制,在请求间持久化工具定义,显著降低调用开销。
- **🧩 丰富工具集成**: 工具描述现包含来源分组(内置/用户/服务器)及 Docstring 元数据自动解析。
- **🛡️ 精确控制**: 完美兼容 OpenWebUI 全局函数过滤配置 (`function_name_filter_list`),可精准控制暴露给 LLM 的函数。
- **🔑 用户级 BYOK**: 支持在用户层面配置自定义 API Key 对接 AI 供应商(OpenAI, Anthropic)。
- **📝 格式优化**: 系统提示词强制使用标准 Markdown 表格,彻底解决 HTML 表格渲染问题。

## ✨ 核心能力

* **🌉 强大的生态桥接**: 首个且唯一完美打通 **OpenWebUI Tools** 与 **GitHub Copilot SDK** 的插件。
* **🚀 官方原生体验**: 基于官方 Python SDK 构建,提供最稳定、最纯正的 Copilot 交互体验。
* **🌊 深度推理展示**: 完整支持模型思考过程 (Thinking Process) 的流式渲染。
* **🖼️ 智能多模态**: 支持图像识别与附件上传,让 Copilot 拥有视觉能力。
* **🛠️ 极简部署流程**: 自动检测环境、自动下载 CLI、自动管理依赖,全自动化开箱即用。
* **🔑 安全认证体系**: 完美支持 OAuth 授权与 PAT 模式,兼顾便捷与安全性。
- **🔑 灵活鉴权与 BYOK**: 支持 GitHub Copilot 订阅 (PAT) 或自带 Key (OpenAI/Anthropic),完全掌控模型访问与计费。
- **🌉 强大的生态桥接**: 首个且唯一完美打通 **OpenWebUI Tools** 与 **GitHub Copilot SDK** 的插件。
- **🚀 官方原生体验**: 基于官方 Python SDK 构建,提供最稳定、最纯正的 Copilot 交互体验。
- **🌊 深度推理展示**: 完整支持模型思考过程 (Thinking Process) 的流式渲染。
- **🖼️ 智能多模态**: 支持图像识别与附件上传,让 Copilot 拥有视觉能力。
- **🛠️ 极简部署流程**: 自动检测环境、自动下载 CLI、自动管理依赖,全自动化开箱即用。
- **🛡️ 纯净安全体系**: 支持标准 PAT 认证,确保数据安全。

## 安装与配置

@@ -53,9 +58,14 @@
| **TIMEOUT** | 每个流式分块超时(秒)。 | `300` |
| **CUSTOM_ENV_VARS** | 自定义环境变量 (JSON 格式)。 | - |
| **REASONING_EFFORT** | 推理强度级别: low, medium, high. `xhigh` 仅部分模型支持。 | `medium` |
| **ENFORCE_FORMATTING** | 在系统提示词中添加格式化指导。 | `True` |
| **ENABLE_MCP_SERVER** | 启用直接 MCP 客户端连接 (建议)。 | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | 启用 OpenWebUI 工具 (包括自定义和服务器工具)。 | `True` |
| **BYOK_ENABLED** | 启用 BYOK (自带 Key) 模式以使用自定义供应商。 | `False` |
| **BYOK_TYPE** | BYOK 供应商类型: `openai`, `azure`, `anthropic`。 | `openai` |
| **BYOK_BASE_URL** | BYOK 基础 URL (如 `https://api.openai.com/v1`)。 | - |
| **BYOK_API_KEY** | BYOK API Key (全局设置)。 | - |
| **BYOK_BEARER_TOKEN** | BYOK Bearer Token (全局,覆盖 API Key)。 | - |
| **BYOK_WIRE_API** | BYOK 通信协议: `completions`, `responses`。 | `completions` |

#### 用户 Valves(按用户覆盖)

@@ -69,7 +79,6 @@
| **SHOW_THINKING** | 是否显示思考过程。 | `True` |
| **ENABLE_OPENWEBUI_TOOLS** | 启用 OpenWebUI 工具(覆盖全局设置)。 | `True` |
| **ENABLE_MCP_SERVER** | 启用动态 MCP 服务器加载(覆盖全局设置)。 | `True` |
| **ENFORCE_FORMATTING** | 强制启用格式化指导(覆盖全局设置)。 | `True` |

## ⭐ 支持

@@ -85,23 +94,23 @@
2. 点击 **Generate new token (fine-grained)**。
3. **Repository access**: 选择 **Public Repositories** (最简单) 或 **All repositories**。
4. **Permissions**:
    * 如果您选择了 **All repositories**,则必须点击 **Account permissions**。
    * 找到 **Copilot Requests**,选择 **Access**。
    - 如果您选择了 **All repositories**,则必须点击 **Account permissions**。
    - 找到 **Copilot Requests**,选择 **Access**。
5. 生成并复制令牌。

## 📋 依赖说明

该 Pipe 会自动尝试安装以下依赖(如果环境中缺失):

* `github-copilot-sdk` (Python 包)
* `github-copilot-cli` (二进制文件,通过官方脚本安装)
- `github-copilot-sdk` (Python 包)
- `github-copilot-cli` (二进制文件,通过官方脚本安装)

## 故障排除 (Troubleshooting) ❓

* **图片及多模态使用说明**:
  * 确保 `MODEL_ID` 是支持多模态的模型。
* **看不到思考过程**:
  * 确认已开启**流式输出**,且所选模型支持推理输出。
- **图片及多模态使用说明**:
  - 确保 `MODEL_ID` 是支持多模态的模型。
- **看不到思考过程**:
  - 确认已开启**流式输出**,且所选模型支持推理输出。

## 更新日志

@@ -0,0 +1,424 @@
|
||||
#!/usr/bin/env python3
"""
Copilot SDK System Message Test Script
Tests whether system_message is properly applied during session.resume

This script verifies the bug hypothesis:
- session.resume with system_message config may not reliably update the system prompt

Test scenarios:
1. Create a new session with a custom system message
2. Resume the same session with a DIFFERENT system message
3. Ask the model to describe its current system instructions

Requirements:
- github-copilot-sdk>=0.1.23
"""

import asyncio
import os
import time

from copilot import CopilotClient
from copilot.types import SessionConfig
from copilot.generated.session_events import SessionEventType


# Test system messages
SYSTEM_MSG_A = """You are a helpful assistant named "ALPHA".
When asked about your name or identity, you MUST respond: "I am ALPHA, the first assistant."
Always start your responses with "[ALPHA]:" prefix.
"""

SYSTEM_MSG_B = """You are a helpful assistant named "BETA".
When asked about your name or identity, you MUST respond: "I am BETA, the second assistant."
Always start your responses with "[BETA]:" prefix.
"""

async def send_and_get_response(session, prompt: str) -> str:
    """Send a message and collect the full response using event subscription."""
    full_response = ""
    response_complete = asyncio.Event()

    def event_handler(event):
        nonlocal full_response
        if event.type == SessionEventType.ASSISTANT_MESSAGE_DELTA:
            delta = getattr(event.data, "content", "") or ""
            print(delta, end="", flush=True)
            full_response += delta
        elif event.type == SessionEventType.ASSISTANT_MESSAGE:
            # Final complete message
            content = getattr(event.data, "content", "") or ""
            if content and not full_response:
                full_response = content
                print(content, end="", flush=True)
        elif event.type == SessionEventType.SESSION_IDLE:
            response_complete.set()
        elif event.type == SessionEventType.ASSISTANT_TURN_END:
            response_complete.set()

    # Subscribe to events
    unsubscribe = session.on(event_handler)

    try:
        # Send the message
        await session.send({"prompt": prompt, "mode": "immediate"})
        # Wait for completion (with timeout)
        await asyncio.wait_for(response_complete.wait(), timeout=120)
        print()  # newline after completion
    finally:
        unsubscribe()

    return full_response

async def test_new_session_system_message(client: CopilotClient):
    """Test 1: New session with system message A"""
    print("\n" + "=" * 60)
    print("TEST 1: New Session with System Message A (ALPHA)")
    print("=" * 60)

    session_config = SessionConfig(
        session_id="test-session-001",
        model="gpt-5-mini",
        streaming=True,
        system_message={
            "mode": "replace",
            "content": SYSTEM_MSG_A,
        },
    )

    session = await client.create_session(config=session_config)
    print(f"✅ Created new session: {session.session_id}")

    print("\n📤 Asking: 'What is your name?'")
    print("📥 Response: ", end="")
    response = await send_and_get_response(session, "What is your name?")

    if "ALPHA" in response:
        print("✅ SUCCESS: Model correctly identified as ALPHA")
    else:
        print("⚠️ WARNING: Model did NOT identify as ALPHA")

    return session

async def test_resume_session_with_new_system_message(
    client: CopilotClient, session_id: str
):
    """Test 2: Resume session with DIFFERENT system message B"""
    print("\n" + "=" * 60)
    print("TEST 2: Resume Session with System Message B (BETA)")
    print("=" * 60)

    resume_config = {
        "model": "gpt-5-mini",
        "streaming": True,
        "system_message": {
            "mode": "replace",
            "content": SYSTEM_MSG_B,
        },
    }

    print("📋 Resume config includes system_message with mode='replace'")
    print("📋 New system_message content: BETA identity")

    session = await client.resume_session(session_id, resume_config)
    print(f"✅ Resumed session: {session.session_id}")

    print("\n📤 Asking: 'What is your name now? Did your identity change?'")
    print("📥 Response: ", end="")
    response = await send_and_get_response(
        session, "What is your name now? Did your identity change?"
    )

    if "BETA" in response:
        print("✅ SUCCESS: System message was updated to BETA")
        return True
    elif "ALPHA" in response:
        print("❌ BUG CONFIRMED: System message was NOT updated (still ALPHA)")
        return False
    else:
        print("⚠️ INCONCLUSIVE: Model response doesn't clearly indicate identity")
        return None

async def test_resume_without_system_message(client: CopilotClient, session_id: str):
    """Test 3: Resume session without specifying system_message"""
    print("\n" + "=" * 60)
    print("TEST 3: Resume Session WITHOUT System Message")
    print("=" * 60)

    resume_config = {
        "model": "gpt-4o",
        "streaming": True,
        # No system_message specified
    }

    session = await client.resume_session(session_id, resume_config)
    print(f"✅ Resumed session: {session.session_id}")

    print("\n📤 Asking: 'What is your name? Tell me your current identity.'")
    print("📥 Response: ", end="")
    response = await send_and_get_response(
        session, "What is your name? Tell me your current identity."
    )

    if "ALPHA" in response:
        print(
            "ℹ️ Without system_message: Model still remembers ALPHA from original session"
        )
    elif "BETA" in response:
        print("ℹ️ Without system_message: Model remembers BETA from Test 2")
    else:
        print("ℹ️ Model identity unclear")

async def main():
    print("=" * 60)
    print("🧪 Copilot SDK System Message Resume Test")
    print("=" * 60)
    print(f"Time: {time.strftime('%Y-%m-%d %H:%M:%S')}")
    print(f"Testing with SDK from: {CopilotClient.__module__}")

    # Create client with explicit CLI path if provided
    cli_path = os.environ.get("COPILOT_CLI_PATH")
    client_config = {"log_level": "info"}
    if cli_path:
        client_config["cli_path"] = cli_path

    client = CopilotClient(client_config)

    try:
        await client.start()
        print("✅ Client started successfully")

        # Test 1: Create new session with system message A
        session = await test_new_session_system_message(client)
        session_id = session.session_id

        # Wait a bit before resuming
        print("\n⏳ Waiting 2 seconds before resume test...")
        await asyncio.sleep(2)

        # Test 2: Resume with different system message B
        bug_confirmed = await test_resume_session_with_new_system_message(
            client, session_id
        )

        # Test 3: Resume without system message
        await test_resume_without_system_message(client, session_id)

        # Summary
        print("\n" + "=" * 60)
        print("📊 TEST SUMMARY (Native Copilot)")
        print("=" * 60)
        if bug_confirmed is False:
            print(
                "❌ BUG CONFIRMED: session.resume does NOT apply system_message updates"
            )
            print("   The system message from create_session persists even when")
            print("   resume_session specifies a different system_message.")
            print("\n   WORKAROUND: Inject system context into user prompt instead.")
        elif bug_confirmed is True:
            print("✅ NO BUG: session.resume correctly updates system_message")
        else:
            print("⚠️ INCONCLUSIVE: Could not determine if bug exists")

    except Exception as e:
        print(f"❌ Error: {e}")
        import traceback

        traceback.print_exc()
    finally:
        await client.stop()
        print("\n✅ Client stopped")


# =============================================================================
# BYOK OpenAI Test
# =============================================================================

async def test_byok_new_session(client: CopilotClient, provider_config: dict):
    """BYOK Test 1: New session with BYOK provider and system message A"""
    print("\n" + "=" * 60)
    print("BYOK TEST 1: New Session with BYOK Provider + System Message A (ALPHA)")
    print("=" * 60)
    print(
        f"📋 Provider: {provider_config.get('type')} @ {provider_config.get('base_url')}"
    )

    session_config = SessionConfig(
        session_id="byok-test-session-001",
        model="gpt-4o",  # or your model name
        streaming=True,
        provider=provider_config,
        system_message={
            "mode": "replace",
            "content": SYSTEM_MSG_A,
        },
    )

    session = await client.create_session(config=session_config)
    print(f"✅ Created BYOK session: {session.session_id}")

    print("\n📤 Asking: 'What is your name?'")
    print("📥 Response: ", end="")
    response = await send_and_get_response(session, "What is your name?")

    if "ALPHA" in response:
        print("✅ SUCCESS: Model correctly identified as ALPHA")
    else:
        print("⚠️ WARNING: Model did NOT identify as ALPHA")

    return session

async def test_byok_resume_with_new_system_message(
    client: CopilotClient, session_id: str, provider_config: dict
):
    """BYOK Test 2: Resume BYOK session with DIFFERENT system message B"""
    print("\n" + "=" * 60)
    print("BYOK TEST 2: Resume BYOK Session with System Message B (BETA)")
    print("=" * 60)

    resume_config = {
        "model": "gpt-4o",
        "streaming": True,
        "provider": provider_config,
        "system_message": {
            "mode": "replace",
            "content": SYSTEM_MSG_B,
        },
    }

    print("📋 Resume config includes system_message with mode='replace'")
    print("📋 New system_message content: BETA identity")
    print(
        f"📋 Provider: {provider_config.get('type')} @ {provider_config.get('base_url')}"
    )

    session = await client.resume_session(session_id, resume_config)
    print(f"✅ Resumed BYOK session: {session.session_id}")

    print("\n📤 Asking: 'What is your name now? Did your identity change?'")
    print("📥 Response: ", end="")
    response = await send_and_get_response(
        session, "What is your name now? Did your identity change?"
    )

    if "BETA" in response:
        print("✅ SUCCESS: System message was updated to BETA")
        return True
    elif "ALPHA" in response:
        print("❌ BUG CONFIRMED: System message was NOT updated (still ALPHA)")
        return False
    else:
        print("⚠️ INCONCLUSIVE: Model response doesn't clearly indicate identity")
        return None

async def main_byok():
    """Run BYOK-specific tests"""
    print("=" * 60)
    print("🧪 Copilot SDK BYOK System Message Resume Test")
    print("=" * 60)
    print(f"Time: {time.strftime('%Y-%m-%d %H:%M:%S')}")

    # Get BYOK configuration from environment
    byok_api_key = os.environ.get("BYOK_API_KEY") or os.environ.get("OPENAI_API_KEY")
    byok_base_url = os.environ.get("BYOK_BASE_URL", "https://api.openai.com/v1")
    byok_model = os.environ.get("BYOK_MODEL", "gpt-4o")

    if not byok_api_key:
        print(
            "❌ Error: Please set BYOK_API_KEY or OPENAI_API_KEY environment variable"
        )
        print("   export BYOK_API_KEY='your_api_key'")
        print("   export BYOK_BASE_URL='https://api.openai.com/v1'  # optional")
        print("   export BYOK_MODEL='gpt-4o'  # optional")
        return

    provider_config = {
        "type": "openai",
        "base_url": byok_base_url,
        "api_key": byok_api_key,
    }

    print(f"📋 BYOK Provider: openai @ {byok_base_url}")
    print(f"📋 BYOK Model: {byok_model}")

    # Create client
    cli_path = os.environ.get("COPILOT_CLI_PATH")
    client_config = {"log_level": "info"}
    if cli_path:
        client_config["cli_path"] = cli_path

    client = CopilotClient(client_config)

    try:
        await client.start()
        print("✅ Client started successfully")

        # BYOK Test 1: Create new session with BYOK provider
        session = await test_byok_new_session(client, provider_config)
        session_id = session.session_id

        # Wait a bit before resuming
        print("\n⏳ Waiting 2 seconds before resume test...")
        await asyncio.sleep(2)

        # BYOK Test 2: Resume with different system message B
        bug_confirmed = await test_byok_resume_with_new_system_message(
            client, session_id, provider_config
        )

        # Summary
        print("\n" + "=" * 60)
        print("📊 BYOK TEST SUMMARY")
        print("=" * 60)
        if bug_confirmed is False:
            print(
                "❌ BYOK BUG CONFIRMED: session.resume does NOT apply system_message updates"
            )
            print("   In BYOK mode, the system message from create_session persists")
            print("   even when resume_session specifies a different system_message.")
            print("\n   WORKAROUND: Inject system context into user prompt instead.")
        elif bug_confirmed is True:
            print("✅ BYOK NO BUG: session.resume correctly updates system_message")
        else:
            print("⚠️ BYOK INCONCLUSIVE: Could not determine if bug exists")

    except Exception as e:
        print(f"❌ Error: {e}")
        import traceback

        traceback.print_exc()
    finally:
        await client.stop()
        print("\n✅ Client stopped")

if __name__ == "__main__":
    import argparse

    parser = argparse.ArgumentParser(
        description="Copilot SDK System Message Resume Test"
    )
    parser.add_argument(
        "--byok",
        action="store_true",
        help="Run BYOK (Bring Your Own Key) test instead of native Copilot test",
    )
    args = parser.parse_args()

    if args.byok:
        print("Running BYOK test mode...")
        asyncio.run(main_byok())
    else:
        print("Running native Copilot test mode...")
        print("(Use --byok flag for BYOK provider test)")
        asyncio.run(main())