Alibaba's AgentScope team has released CoPaw, an open-source personal AI assistant framework that runs on your own hardware or in the cloud. Licensed under Apache 2.0, it launched in late February 2026 and positions itself as an alternative to Anthropic's OpenClaw for developers who want full control over their agent stack.
The core pitch: CoPaw isn't just a chatbot wrapper. It's a modular workstation built on three layers: AgentScope for agent logic, AgentScope Runtime for execution, and a memory module called ReMe that gives agents persistent context across sessions. ReMe uses a hybrid retrieval system combining vector search and BM25 keyword matching, so the assistant actually remembers previous conversations. Memory is file-based and portable, which means you can inspect, edit, or migrate it.
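The article doesn't quote ReMe's exact fusion formula, but a common way to combine vector and BM25 signals is to normalize each score list and blend them with a weight. A minimal sketch, assuming score-level fusion (the function names and `alpha` weight here are illustrative, not ReMe's API):

```python
import math
from collections import Counter

def bm25_scores(query: str, docs: list[str], k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Okapi BM25 keyword scores for each doc, with naive whitespace tokenization."""
    toks = [d.lower().split() for d in docs]
    avgdl = sum(len(t) for t in toks) / len(toks)
    n = len(docs)
    scores = []
    for t in toks:
        tf = Counter(t)
        s = 0.0
        for term in query.lower().split():
            df = sum(1 for d in toks if term in d)  # document frequency
            if df == 0:
                continue
            idf = math.log((n - df + 0.5) / (df + 0.5) + 1)
            f = tf[term]
            s += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(t) / avgdl))
        scores.append(s)
    return scores

def hybrid_rank(vec_scores: list[float], kw_scores: list[float], alpha: float = 0.5) -> list[float]:
    """Min-max normalize each signal, then blend: alpha * vector + (1 - alpha) * keyword."""
    def norm(xs):
        lo, hi = min(xs), max(xs)
        return [(x - lo) / (hi - lo) if hi > lo else 0.0 for x in xs]
    v, k = norm(vec_scores), norm(kw_scores)
    return [alpha * a + (1 - alpha) * c for a, c in zip(v, k)]
```

In practice the vector scores would come from embedding similarity over the file-based memory; here they would simply be passed in as a second list alongside the BM25 output.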
Local deployment works through Ollama or llama.cpp, with MLX support for Apple Silicon. No API keys needed if you go fully offline. Cloud-side, it supports Alibaba's Qwen model family alongside GPT and Llama-based models. The official docs list DingTalk, Feishu, Discord, QQ, iMessage, and Slack as supported channels, all managed through a unified protocol.
Developers extend CoPaw by dropping Python scripts into a skills directory, following the anthropics/skills spec. Built-in cron scheduling lets agents run tasks autonomously. It's ambitious middleware, though the repo sits at roughly 100 GitHub stars so far, so real-world adoption is still early.
Bottom Line
CoPaw gives developers a self-hosted agent framework with persistent memory and multi-channel support, but with only ~100 GitHub stars, it's still proving itself.
Quick Facts
- Released: late February 2026 by Alibaba's AgentScope team (Tongyi Lab)
- License: Apache 2.0
- Local LLM support: Ollama, llama.cpp, MLX (Apple Silicon)
- Supported channels: Discord, iMessage, DingTalk, Feishu, QQ, Slack
- GitHub stars: ~101 (as of mid-March 2026)