🚀 Xinference v2.2.0 Release Notes
✅ Highlights
- 🧠 Next-Gen LLM Support
- GLM-5
- Kimi-K2.5
- MiniMax-M2.5
- Qwen3.5
🌐 Community Edition Updates
📦 Installation
- Via pip:
pip install 'xinference==2.2.0'
- Via Docker: Pull the latest image, or update using pip inside the container.
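The installation options above can be sketched as shell commands. The Docker image name and tag (`xprobe/xinference:v2.2.0`) are assumptions based on the project's published images; check the README for the exact coordinates:

```shell
# Install or upgrade via pip
pip install 'xinference==2.2.0'

# Or pull the Docker image (image name and tag assumed)
docker pull xprobe/xinference:v2.2.0

# Or update an already-running container in place
docker exec -it <container-id> pip install 'xinference==2.2.0'
```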
🆕 New Model Support
- GLM-5
- Kimi-K2.5
- MiniMax-M2.5
- Qwen3.5
✨ New Features
- Added support for running GLM-5 and Kimi-K2.5 on the vLLM engine.
- Updated related model configurations.
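Assuming a running Xinference server, the new vLLM support can be exercised with the standard launch command; the model name shown follows this release's listing, and actual availability depends on your deployment:

```shell
# Launch GLM-5 on the vLLM engine (model name per this release's model list)
xinference launch --model-name glm-5 --model-engine vllm
```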
🐞 Bug Fixes
- Fixed a multi-file processing issue in create_image_edits.
- Replaced 55 bare except clauses with explicit exception handling to standardize error handling.
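The bare-except cleanup can be illustrated with a minimal before/after sketch (a hypothetical helper, not actual Xinference code):

```python
def parse_port(value, default: int = 9997) -> int:
    """Parse a port number, falling back to a default on bad input."""
    # Before: a bare `except:` would swallow every exception, including
    # KeyboardInterrupt and SystemExit, hiding real bugs:
    #
    #     try:
    #         return int(value)
    #     except:
    #         return default
    #
    # After: catch only the exceptions the fallback can meaningfully handle.
    try:
        return int(value)
    except (TypeError, ValueError):
        return default
```

Narrowing the caught exception types keeps unexpected failures visible instead of silently masking them.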
📚 Documentation Updates
- Updated v2.1.0 documentation.
- Added Docker pull instructions to the README.
🏢 Enterprise Edition Updates
- 🔧 Enhanced PPU Support
Optimized execution and scheduling capabilities in PPU environments, improving the stability and performance of enterprise-level deployments.
🤖 XAgent v0.1.2
- Added PPT generation, and improved the frontend user experience and overall stability.