docs: fix broken relative links in example cases to resolve mkdocs build warnings
@@ -17,15 +17,16 @@ This case study demonstrates how to use the **GitHub Copilot SDK Pipe** with the
- **Plugin Type**: Pipe (GitHub Copilot SDK)
- **Base Model**: Minimax 2.1 (via Pipe integration)
- **Key Capabilities**:
    - **File Processing**: Automatically reads and parses multiple CSV data files.
    - **Code Generation & Execution**: On-the-fly Python scripting to calculate growth rates, conversion rates, and median trends.
    - **Multimodal Output**: Generates Markdown reports, interactive HTML dashboards, and Mermaid timeline charts.

---

## 💬 Conversation Highlights

### 📥 Import Conversation

You can download the raw chat data and import it into your Open WebUI to see the full tool calls and analysis logic:

[:material-download: Download Chat JSON](./star-prediction-chat.json)

@@ -33,22 +34,28 @@ You can download the raw chat data and import it into your Open WebUI to see the
> In Open WebUI, click your **User Avatar** (bottom of the left sidebar) -> **Settings** -> **Data** -> **Import Chats**, then select the downloaded file.

### 1. Data Submission

The **User** provided the traffic source distribution and uploaded:

- `Unique visitors in last 14 days.csv`
- `Total views in last 14 days.csv`
- `star-history.csv`

### 2. Analysis Execution

**Minimax 2.1** received the data and immediately formulated an analysis plan:

1. Calculate the star growth trajectory and growth rates.
2. Analyze visitor-to-star conversion rates.
3. Build linear and median projection models.
4. Generate a milestone timeline.

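The plan above boils down to a few lines of arithmetic. A minimal sketch with made-up sample numbers (the actual chat derives these values from the uploaded CSVs):

```python
from statistics import median

# Hypothetical cumulative daily star counts — stand-ins for star-history.csv
stars = [50, 53, 55, 58, 62]
# Hypothetical daily unique visitors — stand-ins for the visitors CSV
visitors = [120, 150, 130, 160, 140]

# 1. Growth trajectory: day-over-day star gains
daily_gains = [b - a for a, b in zip(stars, stars[1:])]

# 2. Visitor-to-star conversion rate over the window
conversion = sum(daily_gains) / sum(visitors[1:])

# 3a. Linear projection: average daily gain
linear_rate = sum(daily_gains) / len(daily_gains)
# 3b. Median projection: robust against outlier days
median_rate = median(daily_gains)

goal = 100
days_to_goal_linear = (goal - stars[-1]) / linear_rate
days_to_goal_median = (goal - stars[-1]) / median_rate
```

With real data the same structure applies; only the two input lists change.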
### 3. Report Generation

The model produced a comprehensive report. Here are the core projections:

#### 🎯 Key Projections

| Metric | Value | Insight |
| :--- | :--- | :--- |
| **Current Stars** | 62 | 62% of the goal reached |

@@ -88,4 +95,4 @@ gantt

---

> [View GitHub Copilot SDK Pipe Source Code](../../../plugins/pipes/github-copilot-sdk/README.md)
> [View GitHub Copilot SDK Pipe Documentation](./github-copilot-sdk.md)

@@ -17,15 +17,16 @@
- **Plugin Type**: Pipe (GitHub Copilot SDK)
- **Base Model**: Minimax 2.1 (integrated via Pipe)
- **Key Capabilities**:
    - **File Processing**: Automatically reads and parses multiple CSV data files.
    - **Code Generation & Execution**: Writes and executes Python analysis code on the fly to compute growth rates, conversion rates, and median trends.
    - **Multimodal Output**: Generates Markdown reports, interactive HTML dashboards, and Mermaid timeline charts.

---

## 💬 Conversation Highlights

### 📥 Import Conversation

You can download the raw chat data and import it into your Open WebUI to see the full tool calls and analysis logic:

[:material-download: Download Chat JSON](./star-prediction-chat.json)

@@ -33,22 +34,28 @@
> In Open WebUI, click your **User Avatar** (bottom of the left sidebar) -> **Settings** -> **Data** -> **Import Chats**, then select the downloaded file.

### 1. Data Submission

The **User** provided the project's traffic source distribution table and uploaded:

- `Unique visitors in last 14 days.csv`
- `Total views in last 14 days.csv`
- `star-history.csv`

### 2. Analysis Execution

**Minimax 2.1** received the data and immediately formulated an analysis plan:

1. Calculate the star growth trajectory and growth rates.
2. Analyze the visitor-to-star conversion rate.
3. Build linear and median growth models for projection.
4. Generate a milestone timeline.

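The last step of the plan, the milestone timeline, is simple date arithmetic once a daily rate is known. A minimal sketch with a hypothetical rate and start date (the real report derives the rate from `star-history.csv`):

```python
from datetime import date, timedelta

# Hypothetical inputs — the chat derives these from the uploaded CSVs
current_stars = 62
daily_rate = 3.0          # projected stars gained per day
today = date(2024, 1, 1)  # placeholder start date

# Project a calendar date for each milestone
milestones = [70, 80, 90, 100]
timeline = {
    m: today + timedelta(days=round((m - current_stars) / daily_rate))
    for m in milestones
}
```

Each projected date can then be emitted as a Mermaid `gantt` entry, which is how the report renders the timeline chart.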
### 3. Report Generation

The model produced a comprehensive report. Here are its core projections:

#### 🎯 Key Projections

| Metric | Value | Insight |
| :--- | :--- | :--- |
| **Current Stars** | 62 | 62% of the goal reached |

@@ -88,4 +95,4 @@ gantt

---

> [View GitHub Copilot SDK Pipe Source Code](../../../plugins/pipes/github-copilot-sdk/README.md)
> [View GitHub Copilot SDK Pipe Documentation](./github-copilot-sdk.zh.md)

@@ -17,17 +17,18 @@ This case study demonstrates how to use the **GitHub Copilot SDK Pipe** with **M
- **Plugin Type**: Pipe (GitHub Copilot SDK)
- **Base Model**: Minimax 2.1
- **Key Capabilities**:
    - **System Tool Access**: Automatically detects and invokes `ffmpeg` within the container.
    - **Two-Pass Optimization**:
        1. **Pass 1**: Analyzes all frames to generate a custom 256-color palette (`palettegen`).
        2. **Pass 2**: Applies the palette for superior quantization and dithering (`paletteuse`).
    - **Precision Parameters**: Implements 1.4x PTS scaling, Lanczos scaling, and 20 fps rate control.

---

## 💬 Conversation Highlights

### 📥 Import Conversation

You can download the raw chat data and import it into your Open WebUI to see how the model debugs and optimizes the FFmpeg parameters:

[:material-download: Download Chat JSON](./video-processing-chat.json)

@@ -35,14 +36,18 @@ You can download the raw chat data and import it into your Open WebUI to see how
> In Open WebUI, click your **User Avatar** (bottom of the left sidebar) -> **Settings** -> **Data** -> **Import Chats**, then select the downloaded file.

### 1. Processing Requirements

The **User** provided an input file and specific parameters:

- Speed: 1.4x (`setpts=PTS/1.4`)
- Resolution: 1280px width, auto height
- Optimization: must use palette generation
- Constraint: file size < 20MB

### 2. Analysis Execution

**Minimax 2.1** generated and executed the following core logic:

```bash
# Pass 1: Generate optimal palette
ffmpeg -i input.mov -vf "fps=20,scale=1280:-1:flags=lanczos,setpts=PTS/1.4,palettegen" palette.png

# Pass 2: Apply the palette for final quantization and dithering (output name assumed)
ffmpeg -i input.mov -i palette.png -lavfi "[0:v]fps=20,scale=1280:-1:flags=lanczos,setpts=PTS/1.4[x];[x][1:v]paletteuse" output.gif
```

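Both passes share one filter chain, which becomes explicit when the command lines are assembled programmatically. A hypothetical sketch (not part of the original chat; file names mirror the commands above):

```python
# Build the two-pass ffmpeg command lines, sharing one filter chain
filters = "fps=20,scale=1280:-1:flags=lanczos,setpts=PTS/1.4"

# Pass 1: scan the whole clip and emit a 256-color palette
pass1 = ["ffmpeg", "-i", "input.mov",
         "-vf", f"{filters},palettegen", "palette.png"]

# Pass 2: re-run the same chain, then map colors through the palette
pass2 = ["ffmpeg", "-i", "input.mov", "-i", "palette.png",
         "-lavfi", f"[0:v]{filters}[x];[x][1:v]paletteuse", "output.gif"]
```

The lists could then be executed with `subprocess.run(pass1, check=True)` followed by `subprocess.run(pass2, check=True)`; keeping `filters` in one place guarantees the two passes never drift apart.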
### 3. Result Summary

| Metric | Original Video | Processed GIF | Status |
| :--- | :--- | :--- | :--- |
| **File Size** | 38 MB | **14 MB** | ✅ Success |

@@ -63,6 +69,7 @@ ffmpeg -i input.mov -i palette.png -lavfi "[0:v]fps=20,scale=1280:-1:flags=lancz

## 💡 Why This Case Matters

Standard LLMs can only *tell you* how to use FFmpeg. A Pipe powered by the **GitHub Copilot SDK** can:

1. **Interpret** complex multimedia processing parameters.
2. **Access** raw files within the filesystem.
3. **Execute** resource-intensive binary tool tasks.

@@ -70,4 +77,4 @@ Standard LLMs can only "tell you" how to use FFmpeg. However, a Pipe powered by

---

> [View GitHub Copilot SDK Pipe Source Code](../../../plugins/pipes/github-copilot-sdk/README.md)
> [View GitHub Copilot SDK Pipe Documentation](./github-copilot-sdk.md)

@@ -17,17 +17,18 @@
- **Plugin Type**: Pipe (GitHub Copilot SDK)
- **Base Model**: Minimax 2.1
- **Key Capabilities**:
    - **System Tool Access**: Automatically detects and invokes the `ffmpeg` tool inside the container.
    - **Two-Pass Optimization**:
        1. **Pass 1**: Analyzes all video frames to generate an optimal 256-color palette (`palettegen`).
        2. **Pass 2**: Applies the palette for high-quality quantization and dithering (`paletteuse`).
    - **Precision Parameters**: Implements 1.4x PTS scaling, Lanczos scaling, and 20 fps rate control.

---

## 💬 Conversation Highlights

### 📥 Import Conversation

You can download the raw chat data and import it into your Open WebUI to see how the model debugs the FFmpeg parameters step by step:

[:material-download: Download Chat JSON](./video-processing-chat.json)

@@ -35,14 +36,18 @@
> In Open WebUI, click your **User Avatar** (bottom of the left sidebar) -> **Settings** -> **Data** -> **Import Chats**, then select the downloaded file.

### 1. Processing Requirements

The **User** specified an input file and detailed parameters:

- Speed-up: 1.4x (`setpts=PTS/1.4`)
- Resolution: 1280px width, aspect ratio preserved
- Quality: must use palette generation
- Constraint: file size < 20MB

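The 1.4x speed-up and the 20 fps cap together determine the output frame count. A quick sanity check with a hypothetical 10-second source clip (the real input's duration is not stated in this excerpt):

```python
# Hypothetical source clip: 10 s long
src_duration = 10.0
speed = 1.4   # setpts=PTS/1.4 plays frames 1.4x faster
out_fps = 20  # fps=20 in the filter chain

out_duration = src_duration / speed          # clip shortens after the speed-up
out_frames = round(out_duration * out_fps)   # frames stored in the GIF
```

Fewer frames at a fixed fps is one of the two levers that keep the GIF under the 20 MB cap; the palette is the other.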
### 2. Analysis Execution

**Minimax 2.1** wrote and executed the following core logic:

```bash
# Pass 1: Generate optimal palette
ffmpeg -i input.mov -vf "fps=20,scale=1280:-1:flags=lanczos,setpts=PTS/1.4,palettegen" palette.png

# Pass 2: Apply the palette for final quantization and dithering (output name assumed)
ffmpeg -i input.mov -i palette.png -lavfi "[0:v]fps=20,scale=1280:-1:flags=lanczos,setpts=PTS/1.4[x];[x][1:v]paletteuse" output.gif
```

### 3. Result Summary

| Metric | Original Video | Processed GIF | Status |
| :--- | :--- | :--- | :--- |
| **File Size** | 38 MB | **14 MB** | ✅ Pass |

@@ -63,6 +69,7 @@ ffmpeg -i input.mov -i palette.png -lavfi "[0:v]fps=20,scale=1280:-1:flags=lancz

## 💡 Why This Case Matters

A traditional LLM can only *tell you* what to do; a Pipe built on the **GitHub Copilot SDK** can:

1. **Interpret** complex multimedia processing parameters.
2. **Perceive** the raw assets in the filesystem.
3. **Execute** time- and resource-intensive binary tool tasks.

@@ -70,4 +77,4 @@ ffmpeg -i input.mov -i palette.png -lavfi "[0:v]fps=20,scale=1280:-1:flags=lancz

---

> [View GitHub Copilot SDK Pipe Source Code](../../../plugins/pipes/github-copilot-sdk/README.md)
> [View GitHub Copilot SDK Pipe Documentation](./github-copilot-sdk.zh.md)
