Knowledge & Memories Agent for Bookstore
Multilevel hierarchical knowledge and memories are built with LangChain.
Summary
- The client operates a bookstore with various titles spanning genres, authors, and eras. This store is not just a commercial entity but a hub for literary enthusiasts, providing curated collections, rare editions, and a space for readers to delve deep into the world of literature.
- The client envisioned a Conversational Agent that would serve as a knowledgeable companion for customers. This agent would possess in-depth knowledge of the content of books on sale and a vast understanding of classic literature. Beyond just answering queries, the agent should recommend books based on user questions. Furthermore, it would have a 'memory' of past interactions, allowing it to offer personalized recommendations based on previous conversations with the user.
- A system was built entirely on API-based models: GPT-3.5 for generation and ada-002 for embeddings, with Pinecone for knowledge and memory storage. This combination allowed the Conversational Agent to provide insightful book recommendations and maintain continuity across interactions.
TECH CHALLENGE
- Books, by their very nature, contain layered and multifaceted information. The challenge was to design a system that could understand and navigate this hierarchy, from high-level themes and plot summaries to intricate details like character motivations or specific events. The agent should be able to delve into any layer of a book's content based on the user's query.
- Beyond just knowing the events of a book, the agent has to comprehend the personalities of the main characters. This means understanding their motivations, relationships, growth arcs, and how they react in various situations. Deep, nuanced knowledge is essential for answering questions about character traits or predicting hypothetical scenarios.
- The agent is expected to understand individual books and draw comparisons between them. Whether grouping books by similar themes, comparing character arcs across different novels, or ranking events based on their significance, the system has to be adept at comparative literary analysis.
- Users might ask the agent to rank books or characters against criteria such as moral complexity, romance, or suspense. This requires a flexible understanding that lets the agent dynamically rank concepts by varying user-defined criteria.
- One of the most challenging aspects is ensuring the agent remembers past interactions. This 'memory' allows it to provide contextually relevant recommendations that build on previous conversations. Implementing such continuity in a conversational agent, especially one dealing with vast literary data, is a significant technical hurdle.
SOLUTION
- Addressing the hierarchical knowledge challenge began with LangChain, a framework designed for rapidly prototyping conversational agents. It let the team iterate quickly and refine the agent's capabilities, ensuring it could navigate the intricate layers of book content, from overarching themes to minute details. We used several "map-reduce" techniques, where the reduce step is summarization, which was quick to adapt and iterate on with LangChain. Past conversation history is distilled into memories in the same way.
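Below is a minimal sketch of that map-reduce summarization step, assuming the classic LangChain API (imports and class names vary by LangChain version); the chapter text, chunk sizes, and model name are placeholders.

```python
# Sketch of hierarchical "map-reduce" summarization with LangChain.
# Assumes the classic LangChain package layout; adjust imports for newer versions.
from langchain.chat_models import ChatOpenAI
from langchain.chains.summarize import load_summarize_chain
from langchain.docstore.document import Document
from langchain.text_splitter import RecursiveCharacterTextSplitter

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

# Map: split a chapter into chunks and summarize each; Reduce: fold the chunk
# summaries into one chapter-level summary.
splitter = RecursiveCharacterTextSplitter(chunk_size=2000, chunk_overlap=200)
chapter_text = "..."  # placeholder: raw text of one chapter
docs = [Document(page_content=c) for c in splitter.split_text(chapter_text)]

chain = load_summarize_chain(llm, chain_type="map_reduce")
chapter_summary = chain.run(docs)

# The same pattern is applied one level up (chapter summaries -> book summary)
# and to past conversation turns, which are distilled into "memories".
```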
- GPT-3.5 powered the core of the Conversational Agent's knowledge and response generation. Its expansive knowledge base was crucial for deriving character personalities and for comparative analysis. Meanwhile, the ada-002 embedding model gave the agent the vectors it needed to understand, compare, group, and dynamically rank literary concepts against user-defined criteria.
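As an illustration of the embedding-based ranking, the sketch below scores a few books against a user-defined criterion using ada-002 vectors and cosine similarity; the titles, blurbs, and criterion are hypothetical examples.

```python
# Sketch: ranking books against a user-defined criterion via ada-002 embeddings.
# Book blurbs, titles, and the criterion are illustrative placeholders.
import numpy as np
from langchain.embeddings import OpenAIEmbeddings

embedder = OpenAIEmbeddings(model="text-embedding-ada-002")

criterion = "morally complex protagonists"
books = {
    "Crime and Punishment": "A destitute student commits a murder and wrestles with guilt...",
    "Pride and Prejudice": "A witty novel of manners about courtship and first impressions...",
}

def cosine(a, b):
    a, b = np.asarray(a), np.asarray(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query_vec = embedder.embed_query(criterion)
book_vecs = embedder.embed_documents(list(books.values()))

scores = {title: cosine(query_vec, vec) for title, vec in zip(books, book_vecs)}
ranking = sorted(scores, key=scores.get, reverse=True)  # most to least relevant
```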
- Pinecone was employed to manage the vast literary data and ensure the agent could quickly and accurately retrieve relevant information. This vector database stored the hierarchical knowledge, letting the agent seamlessly delve into any depth of a book's content.
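The sketch below shows one way the hierarchy could be encoded in Pinecone, tagging each vector's depth (book, chapter, and so on) in metadata so retrieval can be constrained to a level. It assumes the classic pinecone-client interface; the index name, ids, and metadata schema are placeholders.

```python
# Sketch: hierarchical knowledge in Pinecone, with metadata marking the level.
# Uses the classic pinecone-client interface; newer clients differ slightly.
import pinecone

pinecone.init(api_key="...", environment="...")   # placeholder credentials
index = pinecone.Index("bookstore-knowledge")     # placeholder index name

book_vec = [0.0] * 1536     # placeholder ada-002 embedding of a book summary
chapter_vec = [0.0] * 1536  # placeholder ada-002 embedding of a chapter summary

index.upsert(vectors=[
    ("war-and-peace:summary", book_vec, {"book": "War and Peace", "level": "book"}),
    ("war-and-peace:ch01", chapter_vec, {"book": "War and Peace", "level": "chapter"}),
])

# Retrieval constrained to one depth of the hierarchy via a metadata filter.
query_vec = [0.0] * 1536    # placeholder embedding of the user's question
matches = index.query(
    vector=query_vec,
    top_k=5,
    filter={"book": "War and Peace", "level": "chapter"},
    include_metadata=True,
)
```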
- Since all processing is purely API-based, with no state held in the code and no heavy computations, the system could be fully serverless. We used the AWS serverless toolbox to optimize cloud costs, keep charging 100% usage-based, and scale 1000x at any load spike.
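A minimal sketch of what a stateless entry point might look like as an AWS Lambda handler behind API Gateway; the event fields and the build_agent helper are hypothetical, and all state (knowledge and memories) lives in Pinecone rather than in the function.

```python
# Sketch: a stateless AWS Lambda handler fronting the agent (e.g., via API Gateway).
# Event fields and the build_agent helper are hypothetical placeholders.
import json

def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    user_id = body.get("user_id")
    question = body.get("question", "")

    # Hypothetical helper: wires up LangChain + GPT-3.5 + Pinecone per request
    # and loads this user's memories by id, so no state is kept in the code.
    # agent = build_agent(user_id)
    # answer = agent.run(question)
    answer = f"(placeholder answer for: {question})"

    return {"statusCode": 200, "body": json.dumps({"answer": answer})}
```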
IMPACT
- The implementation of the Conversational Agent transformed the bookstore's customer engagement. Readers now had an intelligent literary companion, guiding them through the vast world of books with personalized recommendations and deep insights.
- With the agent's ability to remember past interactions and provide tailored book suggestions, customers felt a deeper connection to the store. This personalized touch increased sales and ensured customers returned, eager to continue their literary journey with the agent's guidance.
- Integrating an advanced AI-driven solution solidified the bookstore's reputation as a forward-thinking and innovative establishment. It set the store apart in a competitive market, drawing the attention of tech-savvy readers, literary enthusiasts, and technology lovers alike.
Want to make your knowledge base talk?
Talk to Yuliya. She will make sure everything is covered. Don't waste time googling: get all your answers from a relevant expert in under one hour.