Lior Sinclair’s Post

Lior Sinclair

Covering the latest in AI R&D • ML-Engineer • MIT Lecturer • Building AlphaSignal, a newsletter read by 200,000+ AI engineers.

Mem0 gained 20,000 stars on GitHub in 30 days. It's a new memory layer for LLMs that lets you directly add, update, and search memories in your models. It's crucial for AI systems that require persistent context, like customer support and personalized recommendations.

Features:
▸ Multi-Level Memory
▸ Adaptive Personalization
▸ Developer-Friendly API
▸ Cross-Platform Consistency

↓ Are you technical? Check out https://AlphaSignal.ai to get a daily email of breakthrough models, repos, and papers in AI. Read by 180,000+ devs.
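For anyone who wants to try it, here is a minimal sketch of the add/update/search flow described above, assuming mem0's Python client; the method names and keyword arguments follow its README at the time and may differ between releases, so treat them as assumptions rather than the canonical API.

```python
# Minimal sketch of the add / search / update flow described in the post.
# Assumes the mem0 Python package; exact method names and kwargs may vary by version.
from mem0 import Memory

m = Memory()

# Add a memory for a specific user (persistent context).
m.add(
    "Prefers vegetarian options and is allergic to peanuts",
    user_id="alice",
    metadata={"category": "food_preferences"},  # category key is an illustrative choice
)

# Search memories relevant to the current request.
related = m.search("What should I avoid ordering for Alice?", user_id="alice")

# List everything stored for that user.
all_memories = m.get_all(user_id="alice")

# Update an existing memory by id (the shape of returned ids depends on the release).
# m.update(memory_id="<memory-id>", data="Now also avoids shellfish")
```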

Maciej Wolski

Human-like Agents | Closing the gap between AI and REAL Intelligence

4mo

The number of stars on GitHub proves the demand. However, given the quality of research around AI episodic memory, it is unlikely that people will use this in the long term. I don't want to spoil the excitement, but I built something like this for the LLM projects I am involved in quite a long time ago. It is too basic to stop here...

Subham Kundu

Principal AI Engineer at HTCD, AI-First Cloud Security | Knowledge Graphs | Winner of 10+ Hackathons

4mo

The best thing about mem0 is their codebase; it's very well maintained. Under the hood it works like this: 1. They have a SQLite database. 2. They store the memories in there. 3. They use function calling to understand when to retrieve or add memory. The best part is how they have unified function calling for all LLM providers into one format.
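A rough sketch of that pattern (not mem0's actual code): a local SQLite table holds the memories, and an OpenAI-style tool schema lets the model decide when to add or retrieve one. The table layout, function names, and schemas here are illustrative assumptions.

```python
# Sketch of the "SQLite store + function calling" pattern described above.
# This is not mem0's implementation, just the general shape of the idea.
import sqlite3
import uuid

conn = sqlite3.connect("memories.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS memories (id TEXT PRIMARY KEY, user_id TEXT, text TEXT)"
)

def add_memory(user_id: str, text: str) -> str:
    """Store a durable fact about the user and return its id."""
    mem_id = str(uuid.uuid4())
    conn.execute("INSERT INTO memories VALUES (?, ?, ?)", (mem_id, user_id, text))
    conn.commit()
    return mem_id

def search_memory(user_id: str, query: str) -> list[str]:
    """Naive keyword match; a real memory layer would use embeddings / vector search."""
    rows = conn.execute(
        "SELECT text FROM memories WHERE user_id = ? AND text LIKE ?",
        (user_id, f"%{query}%"),
    ).fetchall()
    return [r[0] for r in rows]

# One provider-agnostic tool schema (OpenAI-style JSON) the model can call
# to decide when to add or retrieve a memory.
MEMORY_TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "add_memory",
            "description": "Store a durable fact about the user.",
            "parameters": {
                "type": "object",
                "properties": {
                    "user_id": {"type": "string"},
                    "text": {"type": "string"},
                },
                "required": ["user_id", "text"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "search_memory",
            "description": "Retrieve memories relevant to the current request.",
            "parameters": {
                "type": "object",
                "properties": {
                    "user_id": {"type": "string"},
                    "query": {"type": "string"},
                },
                "required": ["user_id", "query"],
            },
        },
    },
]
```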

Matt Moderwell

Founder @ Ouro. Designing cognitive architectures for AI agents. Everyone is an entrepreneur. Working on human-AI collaboration to solve the big problems.

4mo

This is really cool; looking forward to testing it out. Is there any functionality to create use-case-specific contexts from memory? Something to change the context created based on user intent. For example, working with a chatbot on code iteration might only need memory of the last couple of code blocks and a summary of changes, but an exploratory conversation might be better with a full summary of the conversation.
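One way that idea could look in practice, purely as a hypothetical sketch (the post does not confirm such a feature exists): an intent flag selects which slice of memory goes into the context. All names below are illustrative.

```python
# Hypothetical intent-dependent context assembly; not a confirmed mem0 feature.
def build_context(intent: str, messages: list[dict], summary: str) -> str:
    """Pick a memory slice based on user intent."""
    if intent == "code_iteration":
        # Only the last couple of code blocks plus the running summary of changes.
        code_blocks = [m["content"] for m in messages if m.get("type") == "code"]
        return summary + "\n\n" + "\n\n".join(code_blocks[-2:])
    # Exploratory conversation: lean on the full conversation summary instead.
    return summary

# Example: an upstream intent classifier (or a simple heuristic) picks the strategy.
context = build_context(
    intent="code_iteration",
    messages=[
        {"type": "code", "content": "def foo():\n    return 42"},
        {"type": "text", "content": "Can you refactor this?"},
    ],
    summary="User is iterating on a small utility module.",
)
```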

Vincent Granville

AI/LLM Disruptive Leader | GenAI Tech Lab

4mo

Here is how we handle context and customization: https://mltblog.com/3WcTS9C



Lindsay Richman

Founder, Innerverse AI | McKinsey Alum | Google for Startups | VentureBeat Top Woman in AI

4mo

Lior S. Is there an efficient way to load memory with this package? Right now I simply clean and append all interactions to memory with a script when grafting; the data essentially becomes an LC prompt. I'm able to work with this pretty efficiently by using voice commands (or text queries), which enable agents to retrieve data from their knowledge and memory. One thing I do like about this is that the metadata appears to have assigned categories within it, which might be helpful down the line!
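A possible way to bulk-load past interactions instead of appending everything to one long prompt, sketched against mem0's Python client; the metadata and category fields shown here are illustrative assumptions, not documented behavior.

```python
# Hypothetical bulk-load sketch: replay past interactions into the memory layer
# so agents can retrieve them on demand rather than reading one long prompt.
# Assumes the mem0 Python package; kwargs may differ between releases.
from mem0 import Memory

m = Memory()

past_interactions = [
    {"text": "Asked for a weekly summary of portfolio changes", "category": "requests"},
    {"text": "Prefers answers as short bullet points", "category": "style"},
]

for item in past_interactions:
    m.add(item["text"], user_id="user_123", metadata={"category": item["category"]})

# Later, an agent (triggered by voice or text) pulls only what is relevant.
relevant = m.search("How should I format my reply?", user_id="user_123")
```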

Victory Adugbo

Hacking Growth for AI, Web3, and FinTech Companies || Blockchain Instructor at CCHUB || Building Smarter Futures for CohorteAI || Turning AI Chaos into Business Success Stories ||

4mo

Persistent context is important for AI because it allows systems to retain and recall past interactions, enhancing decision-making, contextual understanding, and personalization. This is crucial for applications like customer support and personalized recommendations, where maintaining a continuous and coherent user experience is essential.


Amazing innovation! Mem0's memory layer for LLMs is a game-changer in AI systems. The features are impressive and developer-friendly. Well done! Lior S.

Eugenio Schiavoni

Data Scientist At Coderio

4mo

Love this post. Everything I needed for this weekend. For more than a year I have thought that a memory implementation of this type would be great, and here it is! I think it has countless uses: customizing LLMs for personal use and for clients, and, most interestingly, the LLM has its own personal memory, perhaps with a pipeline that automates saving the LLM's memories. One more step toward AGI; little by little it is approaching human cognitive processing. It is still a long way off, but very much on its way. Thank you so much for sharing this!


