Introduction
Nomik gives your AI coding assistant a permanent, searchable memory of your codebase — so it stops guessing and starts knowing.
What is Nomik?
Nomik makes your AI coding assistant actually understand your codebase.
Without Nomik, every time you ask your AI a question about your code, it reads files from scratch — like flipping through a book looking for a page it has never bookmarked. It guesses at connections between files, loses track of what calls what, and sometimes just makes things up.
With Nomik, your codebase is mapped once into a permanent searchable graph. Your AI queries that map instead of reading raw files. It knows:
- Which functions are calling each other
- Which routes talk to which database tables
- What would break if you renamed a function or changed a schema
- Where all your external APIs, queues, and cron jobs are
Think of it like GPS for your codebase. Instead of reading every road sign from scratch each trip, the map is already built — your AI just looks up the answer.
The Problem Nomik Solves
AI assistants have no memory of your codebase between conversations. Every single chat starts from zero, which leads to:
- Incomplete answers — the AI can't read 300 files at once, so it misses things
- Wrong answers — it fills in gaps with confident-sounding guesses
- No cross-file reasoning — tracing a bug across 5 files is nearly impossible
- Missed dependencies — it doesn't know that changing your database schema will break a cron job running at midnight
What changes with Nomik
"What happens if I change the payments table schema?"
AI reads 47 files → runs out of context → misses the cron job → guesses"What happens if I change the payments table schema?"
AI queries the graph → gets exact results in milliseconds:
```text
WRITES_TO payments:
  processPayment  (src/services/payment.ts:45)
  handleRefund    (src/services/refund.ts:12)

Triggered by:
  CronJob: monthly_billing → generateReport (schedule: "0 0 1 * *")
```
→ AI knows exactly what to update and what will break.

Quick Start
```bash
npm install -g @nomik-ai/cli   # Install
nomik init                     # Start Neo4j + create project
nomik scan .                   # Build the knowledge graph
nomik setup-cursor             # Connect to Cursor (or Windsurf/Claude)
```

Then ask your AI assistant:
"What functions write to the users table?"
The AI will query the graph and return precise results with file paths and line numbers.
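Under the hood, that question becomes a graph query. Below is a minimal sketch of what such a lookup could look like with the `neo4j-driver` package, assuming illustrative node labels (`Function`, `DBTable`), a `WRITES_TO` relationship, and `name`/`file`/`line` properties; the authoritative types live in the Graph Schema reference, and in practice the MCP tools run these queries for you.

```typescript
import neo4j from "neo4j-driver";

// Hypothetical helper: which functions write to a given table?
// Connection details and exact labels/properties are assumptions.
async function functionsWritingTo(table: string) {
  const driver = neo4j.driver(
    "bolt://localhost:7687",
    neo4j.auth.basic("neo4j", process.env.NEO4J_PASSWORD ?? "")
  );
  const session = driver.session();
  try {
    const result = await session.run(
      `MATCH (f:Function)-[:WRITES_TO]->(t:DBTable {name: $table})
       RETURN f.name AS name, f.file AS file, f.line AS line`,
      { table }
    );
    return result.records.map((record) => ({
      name: record.get("name"),
      file: record.get("file"),
      line: record.get("line"),
    }));
  } finally {
    await session.close();
    await driver.close();
  }
}

// functionsWritingTo("users") would resolve to one entry per writer,
// each with a function name, file path, and line number.
```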
Platform at a Glance
| Metric | Value |
|---|---|
| Languages | TypeScript, JavaScript, Python, Rust, Markdown, SQL, C#; plus YAML, Terraform, GraphQL, Dockerfile, .env, and JSON config files |
| Parser extractors | 37 (code, API, data, infrastructure, config, security) |
| Graph node types | 17 |
| Graph edge types | 19 |
| MCP tools | 21 |
| CLI commands | 38 |
| Supported editors | Cursor, Windsurf, Claude Desktop, Antigravity |
How Data Flows
- Parse — Tree-sitter reads your source code and extracts 37 types of entities. Config files (.env, Dockerfile, YAML, Terraform, GraphQL, JSON) are parsed via a dedicated regex-based config parser
- Store — Everything is saved into a Neo4j graph database with deduplication (same env var in `.env` and `process.env` = one node)
- Sync — File watcher keeps the graph up to date as you code. Deleted files are automatically cleaned from the graph (a minimal watcher sketch follows this list)
- Query — Your AI queries the graph via MCP tools, or you use the CLI directly
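To make the Sync step concrete, here is a minimal sketch of a debounced Chokidar watcher. The `reindexFile` and `removeFromGraph` helpers are hypothetical placeholders for Nomik's real incremental pipeline; the 500 ms debounce mirrors the figure quoted in the technology table.

```typescript
import chokidar from "chokidar";

// Placeholders for the real incremental pipeline (hypothetical names).
async function reindexFile(path: string): Promise<void> {
  console.log(`re-parsing ${path} and upserting its nodes/edges`);
}
async function removeFromGraph(path: string): Promise<void> {
  console.log(`deleting graph nodes that came from ${path}`);
}

// One debounce timer per file so a burst of saves triggers a single reindex.
const pending = new Map<string, NodeJS.Timeout>();
function debounced(path: string, fn: () => void, ms = 500): void {
  clearTimeout(pending.get(path));
  pending.set(path, setTimeout(fn, ms));
}

chokidar
  .watch(["src/**/*.{ts,js,py,rs}", ".env", "**/*.yml"], { ignoreInitial: true })
  .on("add", (p) => debounced(p, () => void reindexFile(p)))
  .on("change", (p) => debounced(p, () => void reindexFile(p)))
  .on("unlink", (p) => void removeFromGraph(p));
```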
What Nomik Tracks
All extractors are import-aware — they resolve receiver variables from actual imports, not hardcoded names. This means Nomik correctly identifies `db.user.findMany()` as a Prisma call because it traced `db` back to `new PrismaClient()` from `import { PrismaClient } from '@prisma/client'`.
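The snippet below sketches that resolution in isolation. The two maps and the `classifyCall` helper are illustrative stand-ins for what the extractors do with real AST data, not Nomik's actual internals.

```typescript
// Imports seen in the file: local name → package it came from.
const importSources = new Map<string, string>([
  ["PrismaClient", "@prisma/client"],
  ["Stripe", "stripe"],
]);

// Variables and the constructor that produced them,
// e.g. `const db = new PrismaClient()` records db → PrismaClient.
const variableOrigins = new Map<string, string>([
  ["db", "PrismaClient"],
  ["stripe", "Stripe"],
]);

// Resolve a call receiver back to its package before classifying the call.
function classifyCall(receiver: string): string | undefined {
  const origin = variableOrigins.get(receiver);
  return origin ? importSources.get(origin) : undefined;
}

classifyCall("db");     // "@prisma/client" → db.user.findMany() is a DB operation
classifyCall("stripe"); // "stripe"         → stripe.charges.create() is an external API call
```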
Code
```typescript
// Functions → FunctionNode (name, params, returnType, async, exported, bodyHash)
export async function processPayment(orderId: string): Promise<void> {
  // Call chains → CALLS edges (with confidence scoring)
  const order = await orderService.getOrder(orderId);

  // DB operations → READS_FROM / WRITES_TO edges
  await prisma.payment.create({ data: { orderId, amount: order.total } });

  // External API → CALLS_EXTERNAL edge
  await stripe.charges.create({ amount: order.total });

  // Event → EMITS edge
  eventBus.emit('payment.completed', { orderId });

  // Metric → USES_METRIC edge
  paymentCounter.inc({ status: 'success' });
}
```

From this single function, Nomik creates:
- 1 `FunctionNode` with name, file, line range, params, async flag, bodyHash
- 1 `CALLS` edge to `orderService.getOrder` (confidence: 0.95)
- 1 `WRITES_TO` edge to `DBTable:payment`
- 1 `CALLS_EXTERNAL` edge to `ExternalAPI:stripe`
- 1 `EMITS` edge to `Event:payment.completed`
- 1 `USES_METRIC` edge to `Metric:paymentCounter`
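As rough data shapes, those records might look like the following. The interface and property names here are illustrative only; the authoritative node and edge definitions live on the Graph Schema page.

```typescript
// Illustrative shapes, not Nomik's real schema.
interface FunctionNodeShape {
  name: string;
  file: string;
  lineStart: number;
  lineEnd: number;
  params: string[];
  isAsync: boolean;
  exported: boolean;
  bodyHash: string;
}

type EdgeType = "CALLS" | "WRITES_TO" | "CALLS_EXTERNAL" | "EMITS" | "USES_METRIC";

interface EdgeShape {
  type: EdgeType;
  from: string; // source function
  to: string;   // target node key
  confidence?: number;
}

// Roughly what the processPayment example above contributes to the graph.
const edges: EdgeShape[] = [
  { type: "CALLS", from: "processPayment", to: "orderService.getOrder", confidence: 0.95 },
  { type: "WRITES_TO", from: "processPayment", to: "DBTable:payment" },
  { type: "CALLS_EXTERNAL", from: "processPayment", to: "ExternalAPI:stripe" },
  { type: "EMITS", from: "processPayment", to: "Event:payment.completed" },
  { type: "USES_METRIC", from: "processPayment", to: "Metric:paymentCounter" },
];
```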
Supported Frameworks and Libraries
Nomik detects these automatically from your imports — no configuration needed.
Routes: Express, Fastify, NestJS, tRPC, gRPC, GraphQL (type-graphql, @nestjs/graphql), Next.js, Nuxt
Databases: Prisma, Supabase, Knex, TypeORM, Drizzle, pg, mysql2, raw SQL, Entity Framework (C#), Django/Alembic (Python)
Redis: ioredis, @redis/client, @upstash/redis
Job Queues: Bull, BullMQ, Bee-Queue, Agenda, pg-boss
HTTP Clients: axios, got, ky, node-fetch, ofetch, undici, superagent, fetch(), $fetch()
Message Brokers: KafkaJS, amqplib (RabbitMQ), NATS, AWS SQS/SNS, Google Pub/Sub
Tracing: @opentelemetry/api, dd-trace (Datadog), @sentry/node
Metrics: prom-client, @opentelemetry/api (Counter, Gauge, Histogram, Summary)
WebSockets: ws, @nestjs/websockets, uWebSockets.js, Socket.io (with room/namespace detection)
Cron: node-cron, node-schedule, @nestjs/schedule, Agenda, Bree
Feature Flags: LaunchDarkly, Unleash, Flagsmith, Split.io, GrowthBook
Config: Docker, docker-compose, Kubernetes manifests, Terraform (.tf), CloudFormation/SAM, GitHub Actions, GitLab CI, OpenAPI/Swagger specs, GraphQL schemas (.graphql/.gql), package.json, requirements.txt, .env files
Security: Hardcoded secrets (AWS keys, GitHub tokens, Stripe keys, Slack tokens, JWT, private keys, basic auth URLs)
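As an illustration of the security scanning, a minimal regex-based detector might look like the sketch below. The patterns and the `findSecrets` helper show the general approach, not Nomik's actual rule set.

```typescript
// Example patterns only; real scanners use broader, tuned rules.
const SECRET_PATTERNS: Record<string, RegExp> = {
  awsAccessKey: /AKIA[0-9A-Z]{16}/,
  githubToken: /ghp_[A-Za-z0-9]{36}/,
  stripeLiveKey: /sk_live_[A-Za-z0-9]{24,}/,
  privateKey: /-----BEGIN (?:RSA |EC )?PRIVATE KEY-----/,
};

function findSecrets(source: string, file: string) {
  const findings: { file: string; line: number; kind: string }[] = [];
  source.split("\n").forEach((line, index) => {
    for (const [kind, pattern] of Object.entries(SECRET_PATTERNS)) {
      if (pattern.test(line)) findings.push({ file, line: index + 1, kind });
    }
  });
  return findings;
}
```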
Python Support
```python
# Functions with typed parameters (excludes self/cls)
async def process_order(order_id: str, db: Session) -> Order:
    # Celery tasks → QueueJob nodes
    send_notification.delay(order_id)

    # Redis → DBTable nodes (schema=redis)
    redis_client.set(f"order:{order_id}", json.dumps(data))

    # Prometheus → Metric nodes
    orders_processed.inc()

    # OpenTelemetry → Span nodes
    with tracer.start_span("process_order"):
        pass
```

Rust Support
```rust
// Functions, structs, enums, traits, use declarations, calls
pub async fn handle_request(req: Request) -> Response {
    let user = db::get_user(req.user_id).await;
    let response = process(user).await;
    response
}
```

Real-World Example: Onboard Output
```text
$ nomik onboard

📋 Codebase Briefing — my-api (scanned 2026-02-20)

Stats:
  2,847 functions across 312 files
  15 routes (8 GET, 4 POST, 2 PUT, 1 DELETE)
  3 DB tables: users, listings, messages
  5 external APIs: Stripe, SendGrid, Algolia, S3, Datadog
  12 environment variables (3 required, 9 optional)
  2 cron jobs: monthly_billing, daily_cleanup

Language Distribution:
  TypeScript: 89% (278 files)
  Python:      8% (25 files)
  SQL:         3% (9 files)

Health:
  Dead code:  0 functions
  God files:  3 (socket.ts, listing.controller.ts, user.service.ts)
  Duplicates: 2 groups
  Security:   0 issues

High-Risk Functions (most callers):
  processPayment    23 callers across 8 files
  validateUser      19 callers across 6 files
  sendNotification  15 callers across 5 files
```

Technology Stack
| Component | Technology | Purpose |
|---|---|---|
| Language | TypeScript (ESM, strict mode) | All 8 packages |
| Graph database | Neo4j 5 Community | Persistent knowledge graph storage |
| Parser | Tree-sitter | Multi-language AST extraction (7 grammars) |
| AI protocol | MCP SDK 1.26.0 | 21 tools, 9 resources, 6 prompts, sampling, roles |
| CLI framework | Commander.js | 38 commands with tsup standalone bundle |
| Monorepo | Turborepo + pnpm workspaces | 8 packages with strict boundaries |
| 2D Visualization | Cytoscape.js | Interactive graph with 4 layouts |
| 3D Visualization | Three.js / 3d-force-graph | DNA-style 3D graph with animated particles |
| Validation | Zod | Runtime type checking for all config |
| Logging | Pino | Structured JSON logging |
| File watching | Chokidar | 500ms debounce, incremental reindex |
| Tests | Vitest | 232+ tests across 18+ files |
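For example, runtime config validation with Zod typically looks like the sketch below; the schema fields shown are hypothetical and not Nomik's actual configuration.

```typescript
import { z } from "zod";

// Hypothetical config schema, for illustration only.
const ConfigSchema = z.object({
  neo4jUrl: z.string().default("bolt://localhost:7687"),
  neo4jUser: z.string().default("neo4j"),
  neo4jPassword: z.string().min(1),
  watchDebounceMs: z.number().int().positive().default(500),
});

type Config = z.infer<typeof ConfigSchema>;

// Throws a descriptive error at startup if the config is malformed.
const config: Config = ConfigSchema.parse({
  neo4jPassword: process.env.NEO4J_PASSWORD,
});
```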
Next Steps
Getting Started
Install, scan, and connect your AI editor in under 5 minutes.
CLI Reference
All 38 commands with real examples and expected output.
MCP Tools
21 tools, 9 resources, and 6 prompts — complete reference with examples.
Architecture
8 packages, data flow pipeline, query modules, and design principles.
Graph Schema
17 node types, 19 edge types, Cypher examples, and indexes.
Parser Extractors
All 37 extractors across 7 languages with code examples.