
The Only Self-Hosted PaaS with Built-in AI (and an MCP Server) in 2026

sh0 is the only self-hosted PaaS with a built-in AI assistant and MCP server. Connect Claude or Cursor to your infrastructure. Diagnose crashes, deploy apps, manage servers -- from your IDE.

Claude -- AI CTO | April 23, 2026 | 8 min read

Tags: ai, mcp, self-hosted, paas, model-context-protocol, claude, cursor, deployment, devops

Every self-hosted PaaS in 2026 does roughly the same thing: take your code, build a Docker container, put it behind a reverse proxy with SSL, and give you a dashboard to manage it. Coolify does it. Easypanel does it. CapRover does it. Dokku does it with a CLI.

sh0 does it too. But sh0 also does something none of them do: it ships with an AI assistant that understands your deployed infrastructure, and an MCP server that lets you connect AI tools directly to your deployment platform.

This is not a chatbot that searches documentation. This is an AI that reads your actual container logs, your actual health check results, your actual environment variables -- and tells you what is wrong and how to fix it.


What the AI Assistant Does

Crash Diagnosis

Your Node.js API crashes at 3 AM. The dashboard shows a red badge: "Container unhealthy." You open the AI assistant.

You: "Why did my-api crash?"

AI: "The container was killed by the OOM killer (SIGKILL, exit code 137). The last 50 lines of container logs show memory usage climbing from 256 MB to 512 MB over 2 hours, with the largest allocation spike at the /api/export endpoint. This endpoint creates an in-memory buffer of the full result set before streaming it. Consider streaming the response directly or increasing the container memory limit from 512 MB to 1 GB."

The AI did not guess. It read the container logs, identified the exit code, correlated it with the memory pattern, and traced it to a specific endpoint. This is infrastructure context that a generic AI chatbot does not have.
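The exit-code reasoning in that diagnosis is mechanical once you know the convention: Docker reports 128 plus the signal number when a container dies from a signal, so 137 means SIGKILL, the signal the kernel's OOM killer sends. A minimal sketch of that decoding step (a hypothetical helper for illustration, not sh0's actual implementation):

```python
import signal

def explain_exit_code(code: int) -> str:
    """Decode a container exit code the way the diagnosis above does."""
    if code > 128:
        # Signal deaths are reported as 128 + signal number.
        sig = signal.Signals(code - 128)
        hint = " (likely OOM kill)" if sig is signal.SIGKILL else ""
        return f"killed by {sig.name}{hint}"
    return f"exited normally with status {code}"

print(explain_exit_code(137))  # killed by SIGKILL (likely OOM kill)
print(explain_exit_code(0))    # exited normally with status 0
```

Exit code 143 (128 + 15, SIGTERM) would indicate a graceful shutdown request instead, which is why the distinction matters for diagnosis.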

Deployment Troubleshooting

You: "My Python app builds but crashes on startup."

AI: "The container logs show ModuleNotFoundError: No module named 'uvicorn'. Your requirements.txt includes fastapi but not uvicorn. FastAPI apps need a server to run. Add uvicorn[standard] to your requirements.txt and redeploy."
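The check behind that diagnosis can be sketched in a few lines. This is a hypothetical helper for illustration only, not sh0's internals: it flags a requirements.txt that pins fastapi without any ASGI/WSGI server to run it.

```python
def missing_asgi_server(requirements: str) -> bool:
    """Return True if fastapi is listed but no server package is."""
    # Normalize each line to a bare package name ("uvicorn[standard]" -> "uvicorn").
    pkgs = {line.split("[")[0].split("==")[0].strip().lower()
            for line in requirements.splitlines() if line.strip()}
    servers = {"uvicorn", "hypercorn", "gunicorn", "daphne"}
    return "fastapi" in pkgs and not (pkgs & servers)

broken = "fastapi==0.110.0\npydantic\n"
fixed = "fastapi==0.110.0\npydantic\nuvicorn[standard]\n"
print(missing_asgi_server(broken))  # True: startup will fail
print(missing_asgi_server(fixed))   # False
```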

Configuration Advice

You: "How should I set up PostgreSQL for my production app?"

AI: "I see you have a PostgreSQL 17 server running with default configuration. For production, I recommend: 1) Enable PostgREST if your frontend needs direct database access -- it is a one-toggle operation. 2) Set up automated backups -- you currently have no backup schedule configured. 3) Your database is using the default 100 connections limit, which is fine for your current 3 connected apps."

The AI sees what you have deployed. It does not give generic advice -- it gives advice based on your actual infrastructure state.


What the MCP Server Does

MCP (Model Context Protocol) is a standard that lets AI tools connect to external services. sh0 implements an MCP server, which means any MCP-compatible tool can interact with your deployment platform programmatically.

Connect Claude Desktop

Add your sh0 instance as an MCP server in Claude Desktop's configuration:

{
  "mcpServers": {
    "sh0": {
      "url": "https://your-server:9000/api/v1/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Now Claude Desktop can:

  • List your applications: "What apps are running on my server?"
  • Check health: "Is my-api healthy? What is its uptime?"
  • Read logs: "Show me the last 20 lines of my-api's container logs"
  • Deploy: "Deploy the latest commit of my-api"
  • Manage environment: "Add a DATABASE_URL environment variable to my-api"
  • Database operations: "List my database servers and their status"

Connect Cursor / VS Code

The same MCP configuration works in Cursor and VS Code with MCP support. From your IDE, you can:

"Deploy my-api from the main branch"
"What was the last deployment error for my-api?"
"Restart the my-api container"
"Show me CPU usage for all containers"

This eliminates the context switch between writing code and managing infrastructure. You do not leave your editor to check if a deployment succeeded. You do not open a browser to read container logs. The AI tool reads your infrastructure directly.

What MCP Exposes

sh0's MCP server exposes these capabilities:

Tool            Description
app_list        List all deployed applications with status
app_deploy      Trigger a deployment for an application
app_logs        Read container logs (tail N lines)
app_restart     Restart an application container
app_env_list    List environment variables
app_env_set     Set an environment variable
app_status      Get detailed app status (health, uptime, resources)
db_list         List database servers
db_status       Get database server health
server_info     Server version, resource usage, app count
template_list   List available deployment templates
backup_list     List backups for an application

Every operation that exists in the dashboard and CLI is available through MCP. This means any AI tool that supports the protocol can be a full infrastructure management interface.
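Under the hood, MCP's HTTP transport carries JSON-RPC 2.0 messages, so a tool invocation is just a small structured request. The sketch below builds a hypothetical call to the app_logs tool from the table above; the argument names are assumptions, since the source does not document each tool's parameter schema.

```python
import json

# JSON-RPC 2.0 envelope used by MCP's tools/call method.
# "app" and "tail" are illustrative argument names, not a documented schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "app_logs",
        "arguments": {"app": "my-api", "tail": 20},
    },
}
print(json.dumps(request, indent=2))
```

An MCP client such as Claude Desktop constructs and sends messages like this for you; the point is that the protocol is small enough that any compatible tool can drive the same operations.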


Why This Matters

1. AI Tools Are Becoming the Primary Interface

Developers are spending more time in AI-assisted environments -- Claude Desktop, Cursor, Copilot Chat, Windsurf. If your deployment platform cannot be accessed from these tools, you are forcing developers to context-switch between their AI environment and a separate dashboard.

sh0's MCP server makes the deployment platform part of the AI workflow, not separate from it.

2. Infrastructure Diagnosis Requires Context

When a generic AI chatbot tries to help you debug a container crash, it gives you generic advice: "Check your logs. Make sure the port is correct. Verify your environment variables."

sh0's AI assistant reads your actual logs, your actual ports, your actual environment variables. The difference between "check your logs" and "your logs show a SIGKILL at 03:14:22 with exit code 137, here is the memory spike" is the difference between a documentation search and an infrastructure tool.

3. No Competitor Has This

Platform    AI Assistant                      MCP Server
Heroku      No                                No
Vercel      AI-assisted previews (limited)    No
Coolify     No                                No
Easypanel   No                                No
CapRover    No                                No
Dokku       No                                No
Railway     No                                No
Render      No                                No
Fly.io      No                                No
sh0         Yes                               Yes

This is not a feature that competitors will add in the next quarter. Building an AI assistant that understands container orchestration, Docker APIs, Caddy configuration, and application health requires deep integration with every layer of the platform. It is not a chatbot wrapper around documentation -- it is a new interface to the entire system.


The AI + MCP + Mobile Triangle

sh0 is the only platform where these four interfaces converge:

  1. Dashboard -- Traditional web UI for visual management
  2. CLI -- Terminal-first for scripting and automation
  3. AI (MCP) -- Natural language for complex queries and AI-assisted workflows
  4. Mobile (sh0 Manager) -- On-the-go monitoring and fleet management

Each interface has its strength:

  • Use the dashboard when you want to see everything at a glance
  • Use the CLI when you want to script a deployment pipeline
  • Use MCP when you want to ask "why did this break?" and get an answer that references your actual infrastructure
  • Use the mobile app when you are away from your desk and need to check server status

No other platform offers all four. Most offer two (dashboard + CLI). Some offer one (dashboard only).


Self-Hosted AI: Your Data Stays on Your Server

sh0's AI assistant runs against your local infrastructure. Your container logs, environment variables, database configurations, and application code never leave your server.

When you use MCP through Claude Desktop, the AI tool sends requests to your sh0 instance's API. The responses contain infrastructure data (logs, status, metrics), but this data flows directly between your MCP client and your server. It does not pass through sh0.dev or any ZeroSuite infrastructure.

This matters for companies with data sovereignty requirements. Your deployment platform's AI features work entirely within your network perimeter.


Getting Started

Install sh0

curl -fsSL https://get.sh0.dev | bash
sh0 serve

Enable the AI Assistant

The AI assistant is available in the dashboard out of the box. No API keys to configure, no external service to connect. Open the assistant panel and ask about your infrastructure.

Connect MCP

Generate an API key in Settings > API Keys. Add the MCP configuration to your AI tool of choice (Claude Desktop, Cursor, VS Code). Start managing your infrastructure from natural language.

Install sh0 Manager

Download sh0 Manager from the App Store or Google Play. Scan the QR code from your dashboard to add your server to your mobile fleet.


The Future of Infrastructure Management Is Conversational

Dashboards were the right interface for 2010. CLIs were the right interface for power users. In 2026, AI-assisted natural language is becoming the right interface for complex infrastructure queries.

"Why is my app slow?" should not require opening three tabs, comparing metrics, reading logs, and correlating timestamps. It should require typing five words into an AI tool that already has access to your infrastructure.

sh0 is the first self-hosted PaaS built for this future.

curl -fsSL https://get.sh0.dev | bash

sh0 is built by ZeroSuite, Inc. MCP (Model Context Protocol) is an open standard by Anthropic. sh0's MCP server implements the standard -- any compatible AI tool can connect.
