MCP Protocol Integration
Vectora implements the Model Context Protocol (MCP), a standardized protocol that lets any MCP-capable IDE connect to Vectora and use it as a context server. It works natively in Claude Code and Cursor.
What is MCP?
MCP is a standardized protocol for connecting AI models to external tools. Vectora exposes its functions (search, analysis) as MCP tools that any compatible IDE can use.
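Under the hood, every interaction is a JSON-RPC 2.0 message. As a hedged sketch of the standard MCP `tools/list` exchange (the descriptions shown are illustrative, not Vectora's documented output):

```jsonc
// IDE → server: discover the available tools
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

// server → IDE (truncated): Vectora's tools appear in the "tools" array
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      { "name": "search_context", "description": "Semantic search for chunks" },
      { "name": "analyze_dependencies", "description": "Find function callers" }
    ]
  }
}
```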
```
IDE (Claude Code / Cursor / other with MCP support)
        ↓
MCP Protocol (JSON-RPC)
        ↓
Vectora Server (localhost:9090 or remote)
        ↓
12 Available Tools
```
Quick Start
Prerequisites
- Node.js 18+
- Vectora installed: `npm install -g @kaffyn/vectora`
- API keys (Gemini, Voyage)
- IDE with MCP support (Claude Code, Cursor, etc.)
Step 1: Initialize Project
```shell
cd ~/your-project
vectora init --name "Your Project"
```
Step 2: Configure MCP in Your IDE
Config file location varies by IDE:
- Claude Code: `~/.claude/claude_desktop_config.json`
- Cursor: `~/.cursor/cursor_config.json`
- Other IDEs: see your IDE's MCP documentation
Add Vectora:
{
"mcpServers": {
"vectora": {
"command": "vectora",
"args": ["mcp"],
"env": {
"GEMINI_API_KEY": "your-value",
"VOYAGE_API_KEY": "your-value",
"VECTORA_NAMESPACE": "your-namespace"
}
}
}
}Step 3: Test
- Restart your IDE
- Look for the `search_context` tool in the MCP menu
- Test it: `@vectora search_context "How to validate tokens?"`
12 Available Tools
| Tool | Function |
|---|---|
| `search_context` | Semantic search over indexed chunks |
| `search_tests` | Find related tests |
| `analyze_dependencies` | Find callers of a function |
| `find_similar_code` | Find similar code patterns |
| `get_file_structure` | Summarize a file's structure |
| `list_files` | List indexed files |
| `list_namespaces` | List namespaces |
| `get_namespace_stats` | Namespace statistics |
| `index_status` | Current index status |
| `reindex` | Force re-indexing |
| `get_config` | Get the current configuration |
| `get_metrics` | Execution metrics |
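Each of these tools is invoked through a JSON-RPC `tools/call` request. A minimal TypeScript sketch of how an MCP client frames such a call (the `query` argument name is an assumption for illustration, not Vectora's documented schema):

```typescript
// Frame a JSON-RPC 2.0 "tools/call" request as an MCP client would.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Example: the call behind `@vectora search_context "How to validate tokens?"`
const req = buildToolCall(1, "search_context", { query: "How to validate tokens?" });

// Over the stdio transport, the client writes one JSON message per line:
const wire = JSON.stringify(req);
console.log(wire);
```

The same framing applies to every tool in the table; only `name` and `arguments` change.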
Practical Workflows
Workflow 1: Understand a Feature
```
You:     "Explain how authentication works"
IDE:     @vectora search_context "authentication"
Vectora: Returns relevant chunks
IDE:     Shows the chunks in context
```
Workflow 2: Debugging
```
You:     "Why does this test fail?"
IDE:     @vectora search_context "test X"
IDE:     @vectora analyze_dependencies "tested function"
Vectora: Returns relevant context
```
Workflow 3: Code Review
```
You:     "Review this function"
IDE:     @vectora find_similar_code "your code"
Vectora: Finds similar patterns
IDE:     Compares with existing code
```
Advanced Configuration
Custom Namespace
```jsonc
{
  "mcpServers": {
    "vectora": {
      "env": {
        "VECTORA_NAMESPACE": "staging" // use a different namespace
      }
    }
  }
}
```
Multiple Synchronized IDEs
If you use multiple IDEs, point each one at the same server command and namespace:
```jsonc
// Claude Code
{
  "mcpServers": {
    "vectora": {
      "command": "vectora",
      "args": ["mcp"],
      "env": {
        "VECTORA_NAMESPACE": "your-namespace"
      }
    }
  }
}

// Cursor: same config
{
  "mcpServers": {
    "vectora": {
      "command": "vectora",
      "args": ["mcp"],
      "env": {
        "VECTORA_NAMESPACE": "your-namespace"
      }
    }
  }
}
```
Both IDEs see the same chunks, indices, and namespaces.
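The only field that has to agree across configs is `VECTORA_NAMESPACE`. A hypothetical helper (not part of Vectora) that checks two configs target the same namespace:

```typescript
// Hypothetical sanity check: do two MCP configs point Vectora at the same namespace?
interface McpConfig {
  mcpServers: {
    vectora?: { command?: string; args?: string[]; env?: { VECTORA_NAMESPACE?: string } };
  };
}

function sameNamespace(a: McpConfig, b: McpConfig): boolean {
  const ns = (c: McpConfig) => c.mcpServers.vectora?.env?.VECTORA_NAMESPACE;
  return ns(a) !== undefined && ns(a) === ns(b);
}

const claudeCfg: McpConfig = {
  mcpServers: { vectora: { command: "vectora", args: ["mcp"], env: { VECTORA_NAMESPACE: "your-namespace" } } },
};
const cursorCfg: McpConfig = {
  mcpServers: { vectora: { command: "vectora", args: ["mcp"], env: { VECTORA_NAMESPACE: "your-namespace" } } },
};
console.log(sameNamespace(claudeCfg, cursorCfg));
```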
Troubleshooting
“Vectora command not found”
```shell
# Check installation
npm list -g @kaffyn/vectora

# Reinstall if needed
npm install -g @kaffyn/vectora --force
```
“Connection refused”
Vectora is not running as a server:
```shell
# Start manually
vectora mcp
```
Or set a custom port in the config:
```json
{
  "env": {
    "VECTORA_MCP_PORT": "9091"
  }
}
```
“API key not found”
Check the environment variables:
```shell
echo $GEMINI_API_KEY
echo $VOYAGE_API_KEY
# If empty, configure them in .env or in the MCP config JSON
```
Performance
- Expected latency: ~300-500 ms (network + API calls)
- Local search: ~100 ms (no external APIs)
- Cache: results are cached in `.vectora/`
- Concurrency: multiple IDEs can point at the same server
Compatible IDEs
| IDE | Support | Status |
|---|---|---|
| Claude Code | Native MCP | Tested |
| Cursor | Native MCP | Tested |
| VS Code | No native MCP | Use the VS Code extension |
| Zed | MCP supported | Not tested |
| Neovim | MCP via plugin | Not tested |
For VS Code, use the VS Code Extension.
Next Steps
- VS Code? → VS Code Extension
- ChatGPT? → ChatGPT Plugin
- Gemini? → Gemini API
Part of Vectora ecosystem · Open Source (MIT)