We’re thrilled to announce that with the release of Mattermost v10, the Mattermost Copilot plugin is now v1.0 (GA)! The release also brings a major enhancement to Copilot: multi-LLM support with custom prompts. This new functionality empowers enterprises to customize AI workflows with their choice of off-the-shelf, private, or self-hosted models, giving you greater flexibility, privacy, and control over your AI-driven tasks. Let’s dive in.
Key Highlights:
- Multi-LLM Support: Integrate and use multiple large language models (LLMs) at the same time in your Mattermost environment, whether third-party models like GPT-4o and Claude 3.5 Sonnet or private, self-hosted ones like Meta Llama 3.1.
- Custom Prompts: Configure a custom prompt for each AI assistant in Mattermost, so administrators can quickly set up multiple bots and tailor each one to its purpose.
- Enhanced Flexibility: Easily switch between models and prompts for different scenarios from any point in Mattermost where you interact with an LLM. Balance third-party models for general tasks with private models for sensitive data to get the best mix of performance and security (a conceptual sketch of this pattern follows the list).
- Optimized Privacy and Security: Use on-premises or self-hosted models to maintain the highest standards of data security.
- Usage Metrics: Track usage and adoption of your LLMs within Mattermost.
- Vision Support: Multimodal, vision-capable LLMs can now be configured in Mattermost and used to reason over images attached to messages.
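If you’re wondering what this mix of models and prompts looks like in practice, here is a minimal conceptual sketch. To be clear: you configure Copilot bots through the Mattermost System Console (see the product documentation linked below), not with code like this. The snippet only illustrates the underlying pattern of pairing each assistant with its own model endpoint and custom instructions, using the openai Python client; the endpoint URL, API keys, model names, and prompts are hypothetical placeholders.

```python
# Conceptual sketch only: Mattermost Copilot itself is configured in the System
# Console, not with this code. This just illustrates the pattern of multiple
# assistants, each bound to its own model endpoint and custom instructions.
# The endpoint URL, API keys, model names, and prompts below are hypothetical.
from dataclasses import dataclass

from openai import OpenAI  # pip install openai


@dataclass
class Assistant:
    """One AI assistant: a model endpoint plus its custom instructions."""
    client: OpenAI
    model: str
    custom_instructions: str

    def ask(self, question: str) -> str:
        response = self.client.chat.completions.create(
            model=self.model,
            messages=[
                {"role": "system", "content": self.custom_instructions},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content


# Third-party hosted model for general-purpose tasks.
general_bot = Assistant(
    client=OpenAI(api_key="YOUR_OPENAI_API_KEY"),
    model="gpt-4o",
    custom_instructions="You help employees draft messages and summarize discussions.",
)

# Self-hosted Llama 3.1 behind an OpenAI-compatible endpoint, kept on private
# infrastructure for sensitive data.
private_bot = Assistant(
    client=OpenAI(api_key="not-needed", base_url="https://llm.internal.example.com/v1"),
    model="llama-3.1-70b-instruct",
    custom_instructions="You work with confidential data; never suggest sharing it outside the organization.",
)

if __name__ == "__main__":
    print(general_bot.ask("Draft a short announcement about our v10 launch."))
    print(private_bot.ask("Summarize the key risks in our internal incident report."))
```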
Get Started:
Ready to see Mattermost Copilot multi-LLM support in action? Our new Academy course guides you through the process of connecting OpenAI, Azure OpenAI, Anthropic, and Meta models to Mattermost.
And remember, Mattermost Copilot is open source, so be sure to check out the GitHub repository and be a part of its development by requesting features and opening issues!
For more details, check out:
- Mattermost v10 blog post: Enhancing mission-critical enterprise collaboration with multi-LLM support for Mattermost Copilot
- Copilot landing page: https://mattermost.com/copilot/
- Product documentation: https://docs.mattermost.com/configure/enable-copilot.html
- GitHub repository: https://github.com/mattermost/mattermost-plugin-ai
Join us in pushing the boundaries of collaboration with AI! Comment below with your feedback or reach out to us at fastfutures@mattermost.com.