Memoripy is a Python library for managing and retrieving context-aware memory interactions in AI-driven applications. It provides short-term and long-term storage, contextual retrieval, concept extraction, and graph-based and clustering techniques, and it integrates with the OpenAI and Ollama APIs, making it a strong choice for developers looking to add memory to AI functionality.
Key Features
- Short-term and Long-term Memory: Manage and categorize memory as either short-term or long-term, depending on importance and frequency of use.
- Contextual Memory Retrieval: Use embeddings and concept recognition to retrieve memories that are both relevant and contextually appropriate.
- Concept Extraction and Embeddings: Use models from OpenAI or Ollama to extract concepts and generate embeddings, enhancing the contextual awareness of your application.
- Graph-Based Associations: Build a concept graph over your memories, using spreading activation to retrieve by relevance rather than just chronology.
- Hierarchical Clustering: Organize similar memories into semantic groups, enabling more accurate and contextually aligned retrieval.
- Decay and Reinforcement Mechanisms: Older memories gradually decay, while frequently accessed memories are reinforced and remain readily available.
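To give a feel for the graph-based retrieval idea, here is a minimal, self-contained sketch of spreading activation over a concept graph. The toy graph, edge weights, and damping factor below are illustrative inventions for this example, not Memoripy's actual API or internals:

```python
# Illustrative sketch of spreading activation: seed concepts are
# activated, then each node passes a damped, weighted share of its
# activation to its neighbors (a single hop, for simplicity).

def spread_activation(graph, seeds, damping=0.5):
    activation = {node: 0.0 for node in graph}
    for seed in seeds:
        activation[seed] = 1.0
    spread = dict(activation)
    for node, links in graph.items():
        for neighbor, weight in links.items():
            spread[neighbor] += activation[node] * weight * damping
    return spread

# Toy concept graph: edge weights encode association strength
graph = {
    "python": {"programming": 0.9, "snake": 0.2},
    "programming": {"python": 0.9, "memory": 0.4},
    "snake": {"python": 0.2},
    "memory": {"programming": 0.4},
}

scores = spread_activation(graph, seeds=["python"])
# "programming" ends up more activated than "snake", because the
# association with the seed concept is stronger
print(sorted(scores, key=scores.get, reverse=True))
```

The payoff of this approach is that a memory can surface because its concepts are strongly associated with the query's concepts, even if it was never retrieved recently.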
Example Usage
To illustrate the capabilities of Memoripy, consider the following example script that sets up a memory management system:
```python
import os

from memoripy import MemoryManager, JSONStorage

def main():
    # Read the OpenAI API key from the environment
    api_key = os.getenv("OPENAI_API_KEY")
    if not api_key:
        raise ValueError("Please set your OpenAI API key.")

    # Define models for chat and embeddings
    chat_model = "openai"
    chat_model_name = "gpt-4o-mini"
    embedding_model = "ollama"
    embedding_model_name = "mxbai-embed-large"

    # Choose storage method
    storage_option = JSONStorage("interaction_history.json")

    # Initialize MemoryManager
    memory_manager = MemoryManager(
        api_key=api_key,
        chat_model=chat_model,
        chat_model_name=chat_model_name,
        embedding_model=embedding_model,
        embedding_model_name=embedding_model_name,
        storage=storage_option
    )

    # New user prompt
    new_prompt = "My name is Khazar"

    # Load up to the five most recent interactions for context
    short_term, _ = memory_manager.load_history()
    last_interactions = short_term[-5:]

    # Retrieve relevant past interactions, excluding the most recent five
    relevant_interactions = memory_manager.retrieve_relevant_interactions(new_prompt, exclude_last_n=5)

    # Generate a response using recent and relevant context
    response = memory_manager.generate_response(new_prompt, last_interactions, relevant_interactions)
    print(f"Generated response:\n{response}")

    # Extract new concepts and store the interaction
    combined_text = f"{new_prompt} {response}"
    concepts = memory_manager.extract_concepts(combined_text)
    new_embedding = memory_manager.get_embedding(combined_text)
    memory_manager.add_interaction(new_prompt, response, new_embedding, concepts)

if __name__ == "__main__":
    main()
```
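The retrieval step in the script above depends on embedding similarity. Memoripy handles this internally, but the core ranking idea can be demonstrated with a self-contained cosine-similarity sketch; the tiny hand-made vectors below merely stand in for the embeddings a real model such as mxbai-embed-large would produce:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means identical direction
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Illustrative stand-in embeddings for stored interactions
stored = {
    "My name is Khazar": [0.9, 0.1, 0.0],
    "The weather is sunny": [0.1, 0.9, 0.2],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "What is my name?"

# Rank stored interactions by similarity to the query embedding
ranked = sorted(stored, key=lambda t: cosine_similarity(query, stored[t]), reverse=True)
print(ranked[0])  # the name interaction is the closest match
```

Ranking by vector similarity rather than keyword overlap is what lets a follow-up like "What is my name?" pull up the earlier introduction even though the two sentences share few words.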
With Memoripy, developers can build systems that not only remember but also understand context, making interactions more meaningful and engaging. Explore advanced memory management in Python with Memoripy.