Empowering AI with a 'Trustworthy Mind': Memory Tensor Pioneers a New Path Amidst Tech Titans' Race

11/28/2025

This article represents the 981st original piece by Deep Dive Atom.

From Parameter Rivalry to Memory Mastery: The AI Industry's Next Frontier

Yang Xiaoxian | Author

Deep Dive Atom Studio | Editor

In our increasingly AI-centric world, you may have run into this scenario: you tell an AI, "I have a sensitive stomach, so steer clear of spicy dishes," only to have it suggest Chongqing noodles a few turns after you change topics. Businesses face the same gap: when they deploy AI for customer service, each new policy or guideline takes time to work its way into the AI's knowledge base, and customer dissatisfaction often spikes during that transition.

While industry giants like OpenAI and Google pour resources into larger parameter scales and richer multimodal capabilities, a startup named Memory Tensor has charted a distinct course by focusing on the capability AI lacks most: memory. By reimagining the memory operating system as fundamental AI infrastructure, Memory Tensor addresses a strategic gap in China's AGI infrastructure landscape. With its core strength of 'enabling AI to remember accurately and use those memories effectively,' it has carved out a unique technological path and built an early ecosystem advantage.

In less than two years, Memory Tensor has forged a differentiated path amidst stiff competition from tech behemoths. Not only has it secured nearly 100 million yuan in angel round funding, but it has also attracted top-tier clients such as China Telecom and China Merchants Securities. Today, it stands as an emerging force in the AI arena that cannot be overlooked.

On November 27, 2025, Memory Tensor unveiled MemOS-MindDock, a cross-platform AI unified memory management assistant. It facilitates seamless cross-platform synchronization, enabling the transfer of historical dialogues, personal settings, and memory content from platforms like ChatGPT and Gemini, thus achieving controllable and manageable unified memory assets.

A Multidisciplinary Team Tackles AI's 'Critical Shortcoming'—the Governable Memory Layer

As the AI industry transitions from the 'multimodal perception' phase to the 'deep cognition' phase, the 'memory capacity' of AI Agents in commercial deployment has emerged as a pivotal factor determining intelligent experiences. Traditional solutions, however, are plagued by four main challenges: sluggish memory generation, weak management, difficult migration, and integration hurdles.

'After AI has gained the ability to see, hear, and speak, what it lacks most is memory,' said CEO Xiong Feiyu, describing the core motivation behind the venture. The founders observed that current AI systems either have 'goldfish memory,' forgetting a conversation the moment it ends, or 'chaotic memory,' surfacing irrelevant material once too much has piled up. For businesses, updating an AI's knowledge still depends on technicians running repetitive training, which is slow and inefficient; ordinary users have to restate their preferences again and again, which makes for a poor experience.

'China needs a large-model route suited to its own context, one that does not simply replicate foreign technology but tackles high costs and high hallucination rates from the ground up.' This became Memory Tensor's founding mission: using memory technology to take AI from 'merely chatting' to 'solving real-world problems.'

Memory Tensor was born from its three core founders' deep experience in AI. CEO Xiong Feiyu is a 'dual expert' who bridges academia and industry: he holds a bachelor's degree from Huazhong University of Science and Technology and a Ph.D. from Drexel University in the United States. As the former head of data intelligence for Alibaba's business middleware, he led the construction of a hundred-billion-scale digital commerce knowledge graph that lifted Taobao and Tmall's annual revenue by more than 14 billion yuan, giving him a first-hand understanding of the pain points of putting technology into production. CTO Li Zhiyu, who holds a Ph.D. from Renmin University, previously led marketing algorithms and user knowledge graph development at Alibaba and Xiaohongshu, supporting Double 11 promotions and influencer matching that generated billions in revenue, and he has a precise grasp of AI commercialization. Chief scientist Dr. Yang Hongkang, a mathematical prodigy who completed his Ph.D. at Princeton in two and a half years, brings deep expertise in the foundational theory of machine learning and proposed the Memory³ (memory-cube) architecture for large models.

Memory Tensor's confidence stems from its team's unique blend of academic rigor and industry acumen. With 88% of members holding master's or doctoral degrees from top institutions like Princeton, Peking University, and Jiaotong University, and experience at major companies like Alibaba and Meituan, the team comprehends both cutting-edge AI theory and the practical needs of businesses and users. Such an 'academic team capable of execution' is rare in the industry.

Backed by this team, Memory Tensor's innovation starts from redefining what AI memory is: elevating 'memory' from a functional layer to a governable infrastructure layer. The company built MemOS, the world's first large-model memory operating system with full-link governance capabilities, taking AI's memory capacity 'from nothing to something, from something to excellence, from excellence to governance.'

Technological Revolution: MemOS Defines the New AI Memory Infrastructure

MemOS's technological breakthroughs were not achieved overnight. In 2024, Memory Tensor introduced the Memory³ 1.0 architecture at WAIC; its hierarchical memory design, which externalizes part of the model's knowledge into storage, cut the hallucination rate by 33% and deployment costs by 75%. The architecture achieved top scores on hallucination benchmarks such as TruthfulQA and quickly drew attention from international teams at Meta and Google.

In 2025, the launch of MemOS, the world's first large-model memory operating system, formalized the governable memory layer. Drawing on the hierarchical design of traditional operating systems, MemOS builds a full-link closed loop from user interaction to permission control through a three-layer architecture of 'API interface layer + memory scheduling layer + storage infrastructure layer,' giving AI memory a unified 'management hub' (a minimal code sketch of this layering follows the three points below):

The interface layer supports seamless conversion between natural language instructions and structured operations, enabling rapid implementation of needs like 'remember my dietary restrictions' or 'update compliance rules.'

The scheduling layer ensures precise and efficient memory calls through multi-granularity strategies and asynchronous loading mechanisms, reducing first-token latency by over 70%.

The storage layer employs graph databases and hierarchical storage (plaintext memory, activated memory, parameter memory) to support memory persistence and cross-platform migration.
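To make the layering concrete, here is a minimal, self-contained Python sketch of how an interface layer, a scheduling layer, and a tiered storage layer could fit together. Every class and method name below is an illustrative assumption made for this article, not the actual MemOS API.

```python
# A minimal, hypothetical sketch of the three-layer split described above.
# All names here are illustrative and are NOT the actual MemOS API.
from dataclasses import dataclass


@dataclass
class MemoryItem:
    """One remembered fact, tagged with its storage tier and owner."""
    key: str
    value: str
    tier: str = "plaintext"   # "plaintext" | "activated" | "parameter"
    owner: str = "default"


class StorageLayer:
    """Storage infrastructure layer: persists memory items by key."""
    def __init__(self) -> None:
        self._items: dict[str, MemoryItem] = {}

    def put(self, item: MemoryItem) -> None:
        self._items[item.key] = item

    def all(self) -> list[MemoryItem]:
        return list(self._items.values())


class SchedulingLayer:
    """Memory scheduling layer: picks which memories to load for a query."""
    def __init__(self, storage: StorageLayer) -> None:
        self.storage = storage

    def recall(self, query: str) -> list[MemoryItem]:
        # Toy keyword overlap; a real scheduler would combine multi-granularity
        # retrieval with asynchronous pre-loading to cut first-token latency.
        words = set(query.lower().split())
        return [m for m in self.storage.all()
                if words & set(m.value.lower().split())]


class InterfaceLayer:
    """API interface layer: turns natural-language requests into structured ops."""
    def __init__(self, scheduler: SchedulingLayer) -> None:
        self.scheduler = scheduler

    def handle(self, utterance: str) -> list[MemoryItem]:
        if utterance.lower().startswith("remember "):
            fact = utterance[len("remember "):]
            self.scheduler.storage.put(MemoryItem(key=fact[:32], value=fact))
            return []
        return self.scheduler.recall(utterance)


# Usage: store a dietary restriction, then recall it in a later turn.
mind = InterfaceLayer(SchedulingLayer(StorageLayer()))
mind.handle("remember the user dislikes spicy food")
print(mind.handle("suggest a dinner that is not spicy food"))
```

The real system naturally does far more per layer (permission checks, asynchronous pre-loading, tiered persistence across plaintext, activated, and parameter memory), but the division of responsibilities follows the same pattern.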

The most distinctive innovation is that MemOS changes the very form memory takes. MemCube acts as a standardized 'memory capsule' that encapsulates information such as user preferences and dialogue context and can migrate flexibly across models and devices. Built-in governance controls cover time-bound retention strategies, permission controls, privacy protection, and watermarking services: enterprises can set knowledge-update timelines, users can decide who may access their memories, and administrators can audit memory flows across the entire lifecycle, making memory controllable and manageable throughout its existence. MemOS also uses memory-transfer and purification pipelines to keep cross-model collaboration consistent and accurate, making 'large-model capabilities at small-model cost' feasible.
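The governance idea can be illustrated with a small, hypothetical 'memory capsule' data structure. The field names below (readers, expires_at, watermark, audit_log) are assumptions made for this sketch and do not reflect the real MemCube schema; the point is only that permissions, expiry, provenance, and auditing travel with the memory itself.

```python
# Hypothetical "memory capsule" carrying its own governance metadata.
# Field names are assumptions for illustration, not the real MemCube schema.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone


@dataclass
class MemoryCapsule:
    content: str              # e.g. a user preference or a dialogue context
    owner: str                # whom the memory belongs to
    readers: set[str]         # identities allowed to read it
    expires_at: datetime      # time-bound strategy: auto-invalidate after this
    watermark: str            # provenance tag for traceability
    audit_log: list[str] = field(default_factory=list)

    def read(self, who: str) -> str | None:
        """Return the content only if the caller is permitted and the capsule is valid."""
        now = datetime.now(timezone.utc)
        allowed = who in self.readers and now < self.expires_at
        self.audit_log.append(
            f"{now.isoformat()} read by {who}: {'ok' if allowed else 'denied'}")
        return self.content if allowed else None


capsule = MemoryCapsule(
    content="User has a sensitive stomach; avoid recommending spicy dishes.",
    owner="user-42",
    readers={"user-42", "diet-assistant"},
    expires_at=datetime.now(timezone.utc) + timedelta(days=90),
    watermark="demo-watermark",
)
print(capsule.read("diet-assistant"))  # permitted: returns the content
print(capsule.read("ads-engine"))      # denied: returns None, but the access is audited
print(capsule.audit_log)
```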

In deployment, MemOS is more than a feature upgrade; it changes how AI systems are built and operated, much as cloud operating systems reshaped enterprise deployment models. Delivered through cloud services, MemOS scored 5.26% higher in accuracy than competing solutions on the LoCoMo memory benchmark, with core metrics at the front of the industry. In China Merchants Securities' financial large-model project, MemOS let the AI absorb newly issued stock market policies immediately, without repeated retraining, and improved response speed by 70%.

On November 27, Memory Tensor launched MemOS-MindDock, which for the first time decouples AI memory from the platforms it lives on, giving users ownership of their memory assets and building a sustainable, platform-independent user-side memory system. Cross-platform synchronization is the visible feature; the underlying goal is a memory asset that is never lost, keeps appreciating, and remains fully under the user's control, completing the shift from 'fragmented AI interactions' to 'continuous intelligent experiences.'

MemOS turns AI's 'ephemeral thinking' into reusable assets for the first time, giving intelligent agents long-term intelligence efficiency rather than one-off computation. MemOS-MindDock converts memory data scattered across AI platforms into core assets that users can control, accumulate, and own, establishing a genuinely user-centric mechanism for accumulating memory. As AI memory technology matures, the industry's competition is shifting from raw computing power to long-term intelligence efficiency.

Commercial Implementation: Leveraging AI Memory Infrastructure to Unlock Full-Scene Value

Alongside its technological breakthroughs, Memory Tensor has made notable commercial progress, building a commercialization closed loop of 'deep B-end cultivation + C-end incubation + ecosystem co-building.'

In the B-end market, Memory Tensor has established a pattern of 'top-tier clients + flagship projects,' with customers in key industries such as finance, telecommunications, and media. In addition to China Merchants Securities and China Telecom, it has worked with Bank of China on projects such as financial-industry large models and customer-service large models. Since its founding, the company has signed contracts with a cumulative value in the tens of millions of yuan, and in 2025 it closed nearly 100 million yuan in angel-round financing backed by institutions such as Futeng Capital and CICC Capital.

The technology is just as close to everyday life for ordinary users. In the C-end market, Memory Tensor has opened up new space through 'rapid incubation + viral growth.' To address consumer needs, it formed an MVP team to quickly incubate applications such as interview coaching and thesis assistance. These 'small but precise' C-end products let ordinary users experience the value of memory-equipped AI first hand while feeding real-world usage back into technical iteration, forming a virtuous cycle in which 'B-end technology supports C-end innovation, and C-end data feeds back into B-end optimization.'

For ordinary users, the value of memory infrastructure is validated most directly through MemOS-MindDock. This C-end application is not merely a scenario-specific tool but the main vehicle for proving out MemOS as a consumer-facing 'personal memory OS.' Its core value lies in demonstrating that 'memory compounding' works: when users rely on functions such as the interview assistant or thesis helper, the AI keeps accumulating their learning habits, professional preferences, and style of expression as memory assets instead of starting from scratch in every conversation. As usage deepens, these memories form an exclusive 'personal experience library,' allowing the AI to provide more precise and coherent help in later job-search coaching, academic writing, or everyday health management and life planning, so that users genuinely feel memory becoming more valuable the more it is used.

At the ecosystem level, Memory Tensor has built an industry collaboration network through 'open source + developer programs.' In July 2025, it open-sourced MemOS and launched the 'Memory Reconstruction Program,' offering developers resource support, engineering templates, and high-quality memory models while inviting developers worldwide to help build the next generation of AI Agents. Developers can use memory extraction, compression, recall, and migration as easily as calling an API, without needing to understand the underlying mechanisms, and can build agents with long-term preferences, stable personas, and state that carries across tasks (a rough sketch of what such calls might look like follows below).
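As a rough illustration of what 'memory as an API call' could feel like for an agent developer, here is a toy facade exposing extract / compress / recall / export operations. These names and behaviors are assumptions for the sketch only and do not describe the actual MemOS interface.

```python
# Illustrative toy facade; names and behavior are assumptions, not the MemOS API.
import json


class AgentMemory:
    def __init__(self) -> None:
        self._store: list[dict] = []

    def extract(self, dialogue: list[str]) -> None:
        """Pull durable facts out of a dialogue (toy heuristic: a 'remember:' prefix)."""
        for turn in dialogue:
            if turn.lower().startswith("remember:"):
                self._store.append({"fact": turn.split(":", 1)[1].strip()})

    def compress(self, max_items: int = 100) -> None:
        """Keep the store bounded; a real system would summarize rather than truncate."""
        self._store = self._store[-max_items:]

    def recall(self, query: str) -> list[str]:
        """Return stored facts sharing at least one keyword with the query."""
        words = set(query.lower().split())
        return [m["fact"] for m in self._store
                if words & set(m["fact"].lower().split())]

    def export(self) -> str:
        """Serialize memories so they could migrate to another model or device."""
        return json.dumps(self._store, ensure_ascii=False)


memory = AgentMemory()
memory.extract(["remember: the user prefers concise answers",
                "what's the weather like today?"])
print(memory.recall("how concise should answers to the user be?"))
print(memory.export())
```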

To make 'memory' a foundational capability of the AGI era, on a par with databases and vector stores, Memory Tensor has built a four-in-one ecosystem around MemOS: 'open-source community + cloud services + dedicated channels + ecosystem partners.' In particular, to let MemOS reach more enterprises quickly through cloud services and scale the technology, the company adopted a 'platform capability building + joint ecosystem expansion' model, working closely with Alibaba Cloud, Huawei Cloud, and the China Telecom Research Institute to extend MemOS into fields such as healthcare, education, and industry, closing an ecosystem loop of 'technology - developers - scenarios.'

Cloud services and MemOS-MindDock, the two key product lines, serve both as critical carriers for putting MemOS technology into practice and as core pillars of the developer ecosystem, reinforcing Memory Tensor's industry collaboration network from the enterprise side and the user side respectively.

Meanwhile, industry acclaim has followed suit. Memory Tensor's team has secured accolades, including a spot in the WAIC 2024 Innovation Roadshow TOP 20. Its core technologies have been featured by authoritative media outlets such as MIT Technology Review and Synced, establishing it as a benchmark in the field of AI memory enhancement. Following the official open-sourcing of MemOS in July 2025, it has drawn a substantial number of developers, paving the way for collaborative ecological development.

Conclusion: Transitioning from Parameter Rivalry to Memory Governance—The AI Industry's Next Frontier

The AI industry is undergoing a transformation from 'broad perception' to 'deep cognition.' Multimodality solves 'seeing and hearing,' while a governable memory infrastructure determines whether AI can 'think' and do so safely, controllably, and efficiently. In 2025, giants such as OpenAI and Google are turning more of their attention to memory features, and Sequoia Capital has even named 'AI memory' the industry's fifth pillar, underscoring the strategic weight of this field.

As 'memory evolves into a system resource,' the AI industry's competitive landscape will shift from parameter scale to 'long-term intelligent efficiency competition'—akin to how e-commerce transitioned from competing on Gross Merchandise Volume (GMV) to Return on Investment (ROI). In the future, enterprises will vie not based on model parameter size but on the efficiency, security, and reusability of memory governance. Memory Tensor distinguishes itself in this emerging sector.

Unlike traditional external retrieval systems or short-term conversational memory, MemOS pioneers the concept of 'governance' as the core attribute of memory infrastructure. By incorporating auditing, rollback, migration, and sharing capabilities, it addresses the industry's longstanding challenge of AI memory being 'useful yet difficult to manage,' filling a void in the domestic market. The introduction of MemOS-MindDock further extends this governance capability from enterprise settings to consumer applications, demonstrating the commercial viability of 'memory compounding' through practical experiences like 'cross-platform memory interoperability + personal memory controllability.' It also enables ordinary users to intuitively perceive the transformations brought about by infrastructure upgrades and innovations, ensuring that every AI interaction can inherit past cognitive accumulations.

For the industry as a whole, Memory Tensor's exploration carries broader implications. It shows that Chinese AI companies can do more than follow international technology roadmaps; they can break new ground at the level of fundamental infrastructure and chart an innovation path suited to China's circumstances, one characterized by low cost, low hallucination, and governability. When AI is equipped with a reliable, trustworthy memory infrastructure, a dual revolution in efficiency and security can unfold, whether in enterprise customer service, financial risk control, or the personalized AI experiences users get through MemOS-MindDock. Memory Tensor stands at the vanguard of that shift, using memory infrastructure to reshape how AI creates value and pushing the intelligent era toward a deeper, safer, and more controllable cognitive frontier.
