An MCP server for efficient code indexing and symbol retrieval using tree-sitter AST parsing to fetch specific functions or classes without loading entire files. It significantly reduces AI token costs by providing O(1) byte-offset access to code components across multiple programming languages.
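The "O(1) byte-offset access" above can be illustrated with a short sketch. Once the tree-sitter index has recorded where a symbol starts and ends in a file, fetching that symbol is a single seek-and-read, independent of file size. The `SymbolSpan` shape and `readSymbol` helper below are illustrative assumptions, not the server's actual internals:

```typescript
import { openSync, readSync, closeSync } from "node:fs";

// Hypothetical index entry: the byte range of a function or class,
// recorded when the file was parsed with tree-sitter.
interface SymbolSpan {
  file: string;
  startByte: number; // inclusive
  endByte: number;   // exclusive
}

// Fetch just the symbol's source text: one seek plus one read of
// (endByte - startByte) bytes, rather than loading the whole file.
function readSymbol(span: SymbolSpan): string {
  const fd = openSync(span.file, "r");
  try {
    const buf = Buffer.alloc(span.endByte - span.startByte);
    readSync(fd, buf, 0, buf.length, span.startByte);
    return buf.toString("utf8");
  } finally {
    closeSync(fd);
  }
}
```

Because only the requested byte range is read and returned, the tokens sent to the model scale with the size of the symbol, not the size of the file.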
These settings are read from `process.env`. You'll be asked to provide them before the server can run.

| Variable | Default | Description |
| --- | --- | --- |
| `ANTHROPIC_API_KEY` | | `export ANTHROPIC_API_KEY=sk-ant-...` |
| `ASTLLM_LOG_FILE` | | Log to file instead of stderr |
| `ASTLLM_LOG_LEVEL` | `warn` | Log level: `debug`, `info`, `warn`, `error` |
| `ASTLLM_MAX_FILE_SIZE_KB` | `500` | Max file size to index (KB) |
| `ASTLLM_MAX_INDEX_FILES` | `500` | Max files to index per repo |
| `ASTLLM_PERSIST` | `0` | Persist the index to `~/.astllm/{path}.json` after every index, and pre-load it on startup (`1` or `true` to enable) |
| `ASTLLM_WATCH` | `0` | Watch the working directory for source file changes and re-index automatically (`1` or `true` to enable). Excluded dirs are never watched; see [Watch excludes](#watch-excludes) |
| `BINARY_PATH` | | |
| `CODE_INDEX_PATH` | `~/.code-index` | Index storage directory |
| `GEMINI_API_KEY` | | |
| `GITHUB_TOKEN` | | GitHub API token (higher rate limits, private repos) |
| `GOOGLE_API_KEY` | | `export GOOGLE_API_KEY=...` |
| `OPENAI_API_BASE` | | |
| `OPENAI_API_KEY` | | |
| `OPENAI_BASE_URL` | | `export OPENAI_BASE_URL=http://localhost:11434/v1` |
| `OPENAI_MODEL` | | `export OPENAI_MODEL=llama3` |
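As a rough sketch of how a Node-based server might consume these variables, the snippet below reads a few of them from `process.env` and applies the defaults listed above. The variable names and defaults come from this README; the parsing logic itself is an assumption, not the server's actual code:

```typescript
// Illustrative config loader (assumed, not the server's real implementation).
// Defaults mirror the table above: warn, 500, 500, 0, 0.
const truthy = (v: string | undefined) => v === "1" || v === "true";

const config = {
  logLevel: process.env.ASTLLM_LOG_LEVEL ?? "warn",
  logFile: process.env.ASTLLM_LOG_FILE,               // undefined -> log to stderr
  maxFileSizeKb: Number(process.env.ASTLLM_MAX_FILE_SIZE_KB ?? "500"),
  maxIndexFiles: Number(process.env.ASTLLM_MAX_INDEX_FILES ?? "500"),
  persist: truthy(process.env.ASTLLM_PERSIST ?? "0"), // "1" or "true" enables
  watch: truthy(process.env.ASTLLM_WATCH ?? "0"),     // "1" or "true" enables
};
```

For example, setting `OPENAI_BASE_URL=http://localhost:11434/v1` and `OPENAI_MODEL=llama3` points the OpenAI-compatible client at a local Ollama endpoint, per the table above.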