M8ven trust score: 45 (grade D) · audited 3 days ago · source registry: glama

vLLM MCP Server

Exposes vLLM capabilities to AI assistants, enabling chat completions, model management, and platform-aware container control with automatic detection of Docker/Podman and GPU availability across Linux, macOS, and Windows.

Install from any public registry

M8ven verifies MCPs across every public registry, so you can install directly from whichever one you prefer.

// key findings
No credential exfiltration, no sensitive file access, no obfuscation
Static analysis found nothing flowing your secrets to unexpected places.
Open source with a license and README
Anyone can audit the code, the license is declared, and the publisher documents what it does.
// full audit trail
The full breakdown of what we checked, the deductions that landed, the network hosts, the dependency advisories, and concrete fix guidance is available to verified publishers.
// improvement guidance — verified publishers only
We have 1 concrete improvement we can share with the publisher of this MCP. It comes with specific guidance to raise the trust score.
// embed badge in your README
[![M8ven Score](https://m8ven.ai/badge/mcp/micytao-vllm-mcp-server-vwnh9k)](https://m8ven.ai/mcp/micytao-vllm-mcp-server-vwnh9k)
commit: 08c7179ce66663b40ba4454a0b9f5e07c8b1cd54
code hash: 893f6e831f9432054cd8dff2b2e3f98161f0bf0a8fff8a713211ef79519d4da3
verified: 4/18/2026, 6:41:30 PM
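The 64-character hex digest above suggests a SHA-256 code hash. A hypothetical sketch of how such a digest could be reproduced over a set of source files (the function name and the sorted-path ordering are assumptions, not M8ven's documented procedure):

```python
# Hypothetical reproduction of a code hash: SHA-256 over file contents
# in a deterministic (sorted) order. The exact scheme M8ven uses is
# not documented here; this only shows the general technique.
import hashlib


def code_hash(paths: list[str]) -> str:
    """Feed each file's bytes into one SHA-256 digest, in sorted order."""
    digest = hashlib.sha256()
    for path in sorted(paths):
        with open(path, "rb") as f:
            digest.update(f.read())
    return digest.hexdigest()
```

Pinning both the commit and a content hash lets a consumer detect a republished package whose code no longer matches the audited revision.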