Cloud & Open Source

Choose the MemOS "Memory" solution that best suits your needs.
Tip
Before writing your first line of code, you can get a quick feel for MemOS's memory capabilities in the MemOS Playground.
  • No Installation Required: Open it directly in your browser and start using it immediately
  • Real Interaction: Chat as you would with a standard chatbot, while the system automatically remembers what you've said
  • Visualized Memory: See exactly which content is turned into memories, and how those memories are scheduled and recalled
👉 Try Playground Now

1. The MemOS Solution Best Suited for You

MemOS provides two "Memory" solutions for AI applications:

  • MemOS Cloud Platform: Simple to adopt, easy to manage. Get started with the cloud service in under 5 minutes; well suited for rapidly building and iterating on AI applications.
  • MemOS Open Source: Self-hosted and fully controllable. Deploy it in your own environment, extend and customize the code, and integrate it deeply with your business needs.

Whether you choose the cloud service or the open source framework, MemOS lets your AI easily gain persistent memory.

You can start with the cloud service for a quick trial, then switch to a self-hosted deployment as your business needs evolve.
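One way to keep that switch painless is to make the memory backend's address a configuration value rather than hard-coding it. The sketch below illustrates the idea only; the environment variable name `MEMOS_BASE_URL` and the `/memories` path are placeholders, not the real MemOS API (consult the official MemOS documentation for actual endpoints and authentication):

```python
import os

# Illustrative placeholder only: the env var name and default URL are
# assumptions for this sketch, not part of the real MemOS API.
# Point at the cloud platform by default; override MEMOS_BASE_URL
# later to target a self-hosted deployment.
BASE_URL = os.environ.get("MEMOS_BASE_URL", "https://memos-cloud.example.com")

def memory_endpoint(path: str) -> str:
    """Build a full URL so only BASE_URL changes when you migrate."""
    return f"{BASE_URL.rstrip('/')}/{path.lstrip('/')}"

# The same application code works against either deployment target.
print(memory_endpoint("/memories"))
```

With this pattern, migrating from the cloud platform to a local deployment is a one-line configuration change instead of a code rewrite.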


2. Selection Guide

Choose MemOS Cloud Platform

  • Rapid Implementation: Add built-in memory to your AI application in just a few minutes. Focus on business logic and features instead of maintaining complex storage and memory-management systems.
  • Zero-Cost Verification: A generous free trial quota lets you validate solution feasibility and product impact at minimal cost.
  • Logs & Monitoring: View call logs in real time in the console and get complete call-chain analysis, making debugging, monitoring, and performance optimization easier.
  • Advanced Features: Knowledge base and continuous dialogue capabilities are fully available via API, enabling more flexible customization and deep integration.

Choose MemOS Open Source

  • Data Security: All components run in your own environment, keeping your data fully under your control and meeting on-premises deployment and privacy-compliance requirements.
  • Custom Configuration: Freely choose LLM providers, inference backends, deployment strategies, and more, for greater flexibility and control.
  • Code Extension: Modify the codebase directly, extend custom features as needed, and contribute improvements back to the community.

3. Still Undecided?