Turn your logs into runtime memory.
Semantic, vector-native logging for LLM-ready teams. Capture every event, vectorize what matters, and query with natural language. No re-instrumentation of your code. Pinecone-native.
Semantic Runtime Intelligence
Ask questions about your runtime data. Get instant insights for debugging, analytics, and product decisions.
How It Works
Four simple steps to turn your logs into semantic, queryable runtime intelligence.
Install the SDK
Drop in Logvine's SDK and connect your project.
npm install logvine
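Then initialize it at startup. A minimal sketch, assuming a named initLogvine export from the logvine package; apiKey and pineconeIndex are illustrative option names here, not confirmed parts of the config.

import { initLogvine } from "logvine"; // assumed named export

initLogvine({
  apiKey: process.env.LOGVINE_API_KEY, // assumption: project credential
  pineconeIndex: "runtime-logs",       // assumption: target Pinecone index
  defaultVectorize: false              // shown in the configuration example below
});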
Log Events Automatically
Logvine hooks into your existing log.info, log.warn, etc.
log.info("user.login", {
userId: "abc123",
plan: "free"
});
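Other levels are captured the same way; a quick sketch assuming log is your existing logger (or Logvine's wrapper around it).

// No call-site changes: existing warn/error calls become structured events.
log.warn("payment.retry", {
  userId: "abc123",
  attempt: 2
});
log.error("payment.failed", {
  userId: "abc123",
  reason: "card_declined"
});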
Vectorize What Matters
Add vectorization rules by level, event, or regex. Logs with vectorize: true are embedded and stored in Pinecone.
initLogvine({
defaultVectorize: false,
vectorizeRules: [{ level: 'error' }]
});
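Rule shapes other than { level } aren't spelled out above, so the following is a sketch: event and match are assumed field names for event-name and regex rules, and the per-call vectorize: true flag is placed in the event payload as one possible location.

initLogvine({
  defaultVectorize: false,
  vectorizeRules: [
    { level: "error" },          // every error-level log
    { event: "user.signup" },    // assumption: match a specific event name
    { match: /^payment\./ }      // assumption: regex over event names
  ]
});

// Per-log override: force this event to be embedded in Pinecone.
log.info("onboarding.step_failed", {
  step: "payment",
  vectorize: true                // assumption: flag lives in the payload
});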
Query in Natural Language
Ask anything like 'What led to the onboarding failures yesterday?' and get insights, not just logs.
// Ask in plain English
"What led to the onboarding failures yesterday?"
// Get structured insights
→ 67% failed at payment validation
→ 23% encountered API rate limits
→ 10% had browser compatibility issues
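To run the same query from code, a hypothetical helper might look like this; queryLogvine and the result shape are assumptions, since only the plain-English flow is shown above.

import { queryLogvine } from "logvine"; // assumed export

const insights = await queryLogvine(
  "What led to the onboarding failures yesterday?"
);

// Illustrative result shape, modeled on the structured insights above.
for (const finding of insights.findings) {
  console.log(`${finding.share}% ${finding.summary}`);
}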
Built for Developers, Loved by Infrastructure Teams
Drop it in and get insights instantly. Logvine captures runtime behavior without bloating your stack.
Simple Configuration
initLogvine({
defaultVectorize: false,
redactKeys: ["token", "email"],
vectorizeRules: [{ level: "error" }]
});
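Redaction in practice, as an illustrative sketch: with redactKeys set as above, matching fields are scrubbed before anything is embedded or stored.

log.info("user.login", {
  userId: "abc123",
  email: "dev@example.com",  // scrubbed before embedding/storage
  token: "session-secret"    // scrubbed before embedding/storage
});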
What Teams Are Saying
"Logvine is like having an AI memory layer for your backend logs."โ Future You
"We caught 3 revenue-blocking bugs in under an hour."โ Probably your next user
"I debugged a semantic error with one query. Unreal."โ Beta Developer