This MCP server, developed by Henry Hawke, provides enhanced Titan Memory capabilities for AI agents. Built with TypeScript and leveraging TensorFlow.js, it offers improved context retention and retrieval through neural network-based memory encoding. The implementation focuses on optimizing long-term information storage and recall for conversational AI, enabling more coherent and contextually aware interactions. It is particularly useful for applications requiring persistent memory across multiple conversations, or for complex, multi-step tasks where traditional context windows fall short.
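Because this is an MCP server, agents and scripts interact with it by connecting over a transport (typically stdio) and invoking the tools listed below. The following is a minimal TypeScript sketch using the official @modelcontextprotocol/sdk client; the launch command, file path, and client name are assumptions and should be adjusted to match how the server is actually installed.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Assumption: the server is launched locally with `node dist/index.js`;
// adjust command/args to match the actual installation.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});

const client = new Client(
  { name: "titan-memory-example", version: "0.1.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Discover the tools the server actually exposes; the tool names used in the
// sketches further down are illustrative and should be checked against this list.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```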
Get help about available tools. Parameters: tool (optional), category (optional), showExamples (optional), verbose (optional)
Initialize the Titan Memory model with custom configuration. Parameters: inputDim, hiddenDim, memoryDim, transformerLayers, numHeads, ffDimension, dropoutRate, maxSequenceLength, memorySlots, similarityThreshold, surpriseDecay, pruningInterval, gradientClip
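The initialization call accepts the hyperparameters listed above. The sketch below groups them into a typed configuration object and forwards them in a single tool call; the tool name `init_model` and the `TitanMemoryConfig` interface are assumptions for illustration, not the server's documented API surface.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Hypothetical shape matching the documented parameter list; actual field
// types and defaults are defined by the server, not here.
interface TitanMemoryConfig {
  inputDim: number;
  hiddenDim: number;
  memoryDim: number;
  transformerLayers: number;
  numHeads: number;
  ffDimension: number;
  dropoutRate: number;
  maxSequenceLength: number;
  memorySlots: number;
  similarityThreshold: number;
  surpriseDecay: number;
  pruningInterval: number;
  gradientClip: number;
}

export async function initTitanMemory(client: Client, config: TitanMemoryConfig) {
  // Tool name is illustrative; confirm it against the server's tool list.
  return client.callTool({ name: "init_model", arguments: { ...config } });
}
```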
Perform a forward pass through the model to get predictions. Parameters: x (input vector or text), memoryState (optional)
Execute a training step to update the model. Parameters: x_t (current input vector or text), x_next (next input vector or text)
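The forward-pass and training-step tools both accept either raw text or a numeric vector for `x` / `x_t` / `x_next`. Below is a hedged sketch of a simple training loop over consecutive text chunks, where each step predicts the next chunk from the current one; the tool names `forward_pass` and `train_step` are assumptions.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Train on consecutive pairs of a text sequence. Tool names are illustrative.
export async function trainOnSequence(client: Client, chunks: string[]) {
  for (let i = 0; i < chunks.length - 1; i++) {
    // Optional: inspect the model's prediction for the current chunk.
    const prediction = await client.callTool({
      name: "forward_pass",
      arguments: { x: chunks[i] },
    });
    console.log(`step ${i} prediction:`, prediction.content);

    // Update the model toward the actual next chunk.
    await client.callTool({
      name: "train_step",
      arguments: { x_t: chunks[i], x_next: chunks[i + 1] },
    });
  }
}
```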
Get the current memory state and statistics. Parameters: type (optional)
Update memory along a manifold direction. Parameters: base, velocity
Remove less relevant memories to free up space. Parameters: threshold (0-1)
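Memory housekeeping can be driven from the statistics returned by the state query: when the memory slots are nearly full, prune low-relevance entries. The sketch below assumes the tool names `get_memory_state` and `prune_memory`, and a particular shape for the returned statistics (`usedSlots`, `memorySlots`); all of these are assumptions to be verified against the server's actual output.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Prune when memory utilisation gets high. Tool names and the statistics
// fields (`usedSlots`, `memorySlots`) are assumptions, not documented fields.
export async function pruneIfNearlyFull(client: Client, threshold = 0.7) {
  const state = await client.callTool({
    name: "get_memory_state",
    arguments: {},
  });

  // Assumption: the server returns its statistics as a JSON string in the
  // first text content item of the tool result.
  const text =
    (state.content as Array<{ type: string; text?: string }>)[0]?.text ?? "{}";
  const stats = JSON.parse(text) as { usedSlots?: number; memorySlots?: number };

  if (stats.usedSlots && stats.memorySlots && stats.usedSlots / stats.memorySlots > 0.9) {
    // Drop memories whose relevance score falls below `threshold` (0-1).
    await client.callTool({ name: "prune_memory", arguments: { threshold } });
  }
}
```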
Save memory state to a file. Parameters: path (checkpoint file path)
Load memory state from a file. Parameters: path (checkpoint file path)
Reset accumulated gradients to recover from training issues. Parameters: None
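Checkpointing and gradient resets matter most for long-running sessions: persist the memory state between runs, restore it on startup, and clear accumulated gradients if training becomes unstable. The sketch below assumes the tool names `save_checkpoint`, `load_checkpoint`, and `reset_gradients`, and the checkpoint path is illustrative.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Persist memory between sessions and recover from unstable training.
// Tool names and the checkpoint path are illustrative, not confirmed API.
const CHECKPOINT_PATH = "./titan-memory.checkpoint.json";

export async function saveMemory(client: Client) {
  await client.callTool({ name: "save_checkpoint", arguments: { path: CHECKPOINT_PATH } });
}

export async function restoreMemory(client: Client) {
  await client.callTool({ name: "load_checkpoint", arguments: { path: CHECKPOINT_PATH } });
}

export async function recoverFromTrainingIssue(client: Client) {
  // Clears accumulated gradients (e.g., after NaNs or exploding updates).
  await client.callTool({ name: "reset_gradients", arguments: {} });
}
```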