OP here.
I posted an early version of this a couple of weeks ago and got some great feedback (and 3 forks!) from engineers dealing with vector DB bloat.
The Core Thesis: We don't need Vector DBs for local AI memory. Approximate nearest-neighbor search is overkill for this use case (roughly O(log n) per query) and expensive.
Remember-Me uses a Coherent State Network Protocol (CSNP) based on Optimal Transport theory (Wasserstein Distance) to achieve O(1) retrieval latency without the massive index overhead.
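For anyone who wants a concrete mental model before digging into the repo, here is a minimal sketch of what constant-time retrieval with a Wasserstein-distance acceptance check can look like: hash the query state into a fixed bucket, then verify the hit with a 1-D Wasserstein distance. This is not Remember-Me's actual code; the names (CoherentStore, _bucket_key), the bucket count, and the projection trick are all made up for illustration.

```python
# Illustrative sketch only: O(1) bucketed lookup + 1-D Wasserstein check.
# CoherentStore and _bucket_key are hypothetical names, not Remember-Me's API.
import numpy as np
from scipy.stats import wasserstein_distance

class CoherentStore:
    def __init__(self, num_buckets=1024, seed=0):
        self.num_buckets = num_buckets
        self.buckets = {}                      # bucket id -> (text, samples)
        rng = np.random.default_rng(seed)
        self.proj = rng.standard_normal(64)    # fixed random projection

    def _bucket_key(self, samples):
        # Coarse signature: pad/truncate to a fixed length, project onto one
        # random direction, quantize. Constant work per call regardless of
        # how many memories are stored.
        v = np.resize(np.asarray(samples, dtype=float), 64)
        return int(np.floor(v @ self.proj)) % self.num_buckets

    def add(self, text, samples):
        # Toy behavior: last write wins on bucket collisions.
        self.buckets[self._bucket_key(samples)] = (text, np.asarray(samples, dtype=float))

    def get(self, samples, max_dist=0.5):
        # O(1): one hash lookup, one distance check -- no index traversal.
        hit = self.buckets.get(self._bucket_key(samples))
        if hit is None:
            return None
        text, stored = hit
        # 1-D Wasserstein (earth mover's) distance as the acceptance test.
        return text if wasserstein_distance(stored, samples) <= max_dist else None

store = CoherentStore()
store.add("user prefers dark mode", samples=[0.1, 0.4, 0.4, 0.9])
print(store.get([0.1, 0.4, 0.4, 0.9]))   # -> "user prefers dark mode"
print(store.get([5.0, 6.0, 7.0, 8.0]))   # -> None (different bucket, or too far)
```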
Benchmarks vs ChromaDB:
Latency: 12 ms (CSNP) vs 45 ms (ChromaDB)
Cost: $0.06/GB (CSNP) vs $2.40/GB (ChromaDB)
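If anyone wants to sanity-check the vector-side latency number on their own box, a rough single-machine timing loop against ChromaDB's Python client looks something like the following. This is not the benchmark script from the repo; the collection name, embedding dimension, and corpus size are arbitrary, and absolute numbers will vary with hardware and embedding size.

```python
# Rough latency check for the ChromaDB side (not the repo's benchmark script).
import time
import numpy as np
import chromadb

dim, n_docs, n_queries = 384, 10_000, 100
rng = np.random.default_rng(0)
embs = rng.standard_normal((n_docs, dim))

client = chromadb.Client()                       # in-memory client
col = client.create_collection("latency_check")  # arbitrary collection name

# Insert in chunks to stay under Chroma's per-call batch limits.
batch = 1000
for start_i in range(0, n_docs, batch):
    end_i = min(start_i + batch, n_docs)
    col.add(ids=[str(i) for i in range(start_i, end_i)],
            embeddings=embs[start_i:end_i].tolist())

queries = rng.standard_normal((n_queries, dim)).tolist()
start = time.perf_counter()
for q in queries:
    col.query(query_embeddings=[q], n_results=5)
per_query_ms = (time.perf_counter() - start) * 1000 / n_queries
print(f"mean query latency: {per_query_ms:.1f} ms over {n_docs} docs")
```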
I'm looking for people to try breaking the 'Zero-Hallucination' guarantee on the new release.
Roast my code