Sébastien Blanc’s Post

Recently, langchain4j-memory-store-redis was added to the Quarkus langchain4j extension. In short, it uses Redis instead of in-process RAM to keep the chat context during a conversation.

But you might ask: what's the use case? Here is one: imagine a serverless chatbot pod (so it scales back to zero) that still needs to keep its context! It can also scale up when needed and keep track of multiple sessions/contexts. In this short screencast you can see that even after the pod scaled back to zero, it was able to remember my name on the second request.

Some cool stuff about this demo (blog post to follow):
- I'm using, of course, Quarkus/langchain4j compiled natively with GraalVM
- I'm using the OpenShift Sandbox because it's free and has the operators I need
- It uses my favourite project in the k8s ecosystem: Knative
- For the Redis part, I'm using the blazing-fast DragonflyDB running on Aiven
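The core idea above — externalizing chat memory to a store keyed by a session/memory id, so a stateless, scale-to-zero pod can recover the conversation on the next request — can be sketched in plain Java. This is a minimal illustration, not the extension's actual API: the `ChatMemoryStore` interface here is a simplified stand-in (langchain4j's real store interface works with typed chat messages), and a map plays the role of Redis.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Simplified stand-in for an external chat-memory store: messages are
// looked up by a session/memory id, not held in the pod's own RAM.
interface ChatMemoryStore {
    List<String> getMessages(String memoryId);
    void appendMessage(String memoryId, String message);
}

// Map-backed stand-in for Redis. In the real demo this would be
// Redis/DragonflyDB, so the state survives the pod scaling back to zero.
class ExternalChatMemoryStore implements ChatMemoryStore {
    private final Map<String, List<String>> store = new ConcurrentHashMap<>();

    public List<String> getMessages(String memoryId) {
        return store.getOrDefault(memoryId, List.of());
    }

    public void appendMessage(String memoryId, String message) {
        store.computeIfAbsent(memoryId, k -> new ArrayList<>()).add(message);
    }
}

public class ScaleToZeroMemoryDemo {
    public static void main(String[] args) {
        ChatMemoryStore store = new ExternalChatMemoryStore();

        // First request: the user introduces themselves.
        store.appendMessage("session-42", "user: Hi, my name is Sébastien");

        // ...the pod scales to zero; a fresh instance handles the next
        // request. Because memory lives outside the pod, the context is
        // still there when we look it up by the same session id.
        List<String> context = store.getMessages("session-42");
        System.out.println(context.get(0));
    }
}
```

Swapping the map for a Redis client (or, in Quarkus, letting the extension wire the store) is what makes the screencast's "remembers my name after scale-to-zero" behavior work.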

Frederico Garcia Costa

Technology Specialist | Solutions Architect | Technical Leader

7mo

An infinispan memory store would be great!

