Serving LLMs with vLLM

🌐 Community
by orchestra-research · vlatest · Repository

Serves LLMs efficiently with vLLM for experimentation and deployment, streamlining development workflows and scaling AI applications.

Install on your platform


1. Run in terminal (recommended):

   claude mcp add orchestra-research-serving-llms-vllm npx -- -y @trustedskills/orchestra-research-serving-llms-vllm
2. Or manually add to ~/.claude/settings.json:
{
  "mcpServers": {
    "orchestra-research-serving-llms-vllm": {
      "command": "npx",
      "args": [
        "-y",
        "@trustedskills/orchestra-research-serving-llms-vllm"
      ]
    }
  }
}

Requires Claude Code (the claude CLI). Run claude --version to verify your install.
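If you prefer to script the manual edit from step 2, the merge into ~/.claude/settings.json can be sketched in Python. This is a minimal sketch, not part of the skill itself: add_mcp_server is a hypothetical helper name, and it assumes the settings file is JSON in the standard location and that other entries under mcpServers should be preserved.

```python
import json
from pathlib import Path


def add_mcp_server(settings_path: Path) -> None:
    """Merge the orchestra-research-serving-llms-vllm server entry into a
    Claude Code settings file, keeping any existing servers intact."""
    settings = {}
    if settings_path.exists():
        settings = json.loads(settings_path.read_text())
    # Create the mcpServers section if it is missing, then add our entry.
    servers = settings.setdefault("mcpServers", {})
    servers["orchestra-research-serving-llms-vllm"] = {
        "command": "npx",
        "args": ["-y", "@trustedskills/orchestra-research-serving-llms-vllm"],
    }
    settings_path.write_text(json.dumps(settings, indent=2))


# Example (hypothetical path): add_mcp_server(Path.home() / ".claude" / "settings.json")
```

Writing the whole file back with json.dumps keeps the result valid JSON, which a hand edit can easily break.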

About This Skill

The source provides no functional description, capabilities, or usage details for this skill beyond the summary above, so specific capabilities, real-world scenarios, example prompts, and technical tips are not listed here.

Tags

🛡️ TrustedSkills Verification

Unlike other registries that point to live repositories, TrustedSkills pins every skill to a verified commit hash. This protects you from malicious updates: what you install today is exactly what was reviewed and verified.

Security Audits

Gen Agent Trust Hub: Pass
Socket: Pass
Snyk: Pass

Details

Version: vlatest
License: (not listed)
Author: orchestra-research
Installs: 38


Passed automated security scans.