Overview

VibeStream exposes its functionality as MCP (Model Context Protocol) tools, allowing AI agents to interact with live streams programmatically.
MCP is mounted at /mcp on the VibeStream server and uses Server-Sent Events (SSE) for communication.

Quick Start

Connect your AI agent to the VibeStream MCP server:
# MCP endpoint
https://vibestream-production-64f3.up.railway.app/mcp

Available Tools

VibeStream provides 5 MCP tools:

check_once - One-shot condition check on a live stream
live_monitor - Start continuous monitoring for a condition
live_digest - Generate periodic summaries of stream content
get_job_status - Check the status of a running job
cancel_job - Cancel a running job

check_once

Perform a single condition check on a YouTube Live stream.

Parameters

youtube_url (string, required) - YouTube Live URL to check
condition (string, required) - Natural language condition (e.g., “Is someone holding a red umbrella?”)
model (string, optional) - VLM model: gemini-2.5-flash, gpt-4o-mini, or claude-3-5-sonnet (default: gemini-2.5-flash)
include_frame (boolean, optional) - Include a base64-encoded frame in the response (default: false)

Example

# Using Claude or another MCP-enabled agent
result = await check_once(
    youtube_url="https://youtube.com/watch?v=LIVE_ID",
    condition="Is it raining?",
    model="gemini-2.5-flash"
)
# Returns: "Triggered: NO\nExplanation: Clear skies visible...\nModel: gemini-2.5-flash"

Response Format

Triggered: YES/NO
Explanation: [VLM analysis of the condition]
Model: [model used]
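
If your client code needs the result as structured data, the response text can be parsed line by line. The helper below is an illustrative sketch (parse_check_once is not part of VibeStream), assuming the plain-text layout shown above:

# Illustrative sketch: parse the check_once response text into a dict,
# assuming "Key: value" per line as documented above.
def parse_check_once(text: str) -> dict:
    fields = {}
    for line in text.splitlines():
        if ": " in line:
            key, value = line.split(": ", 1)
            fields[key.strip().lower()] = value.strip()
    return {
        "triggered": fields.get("triggered", "").upper() == "YES",
        "explanation": fields.get("explanation", ""),
        "model": fields.get("model", ""),
    }

# Example:
# parse_check_once("Triggered: NO\nExplanation: Clear skies visible\nModel: gemini-2.5-flash")
# -> {"triggered": False, "explanation": "Clear skies visible", "model": "gemini-2.5-flash"}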

live_monitor

Start a background job that continuously monitors for a condition and sends webhooks when triggered.

Parameters

youtube_url (string, required) - YouTube Live URL to monitor
condition (string, required) - Natural language condition to watch for
webhook_url (string, required) - Webhook URL for notifications
interval_seconds (int, optional) - Polling interval in seconds, 5-300 (default: 10)
model (string, optional) - VLM model (default: gemini-2.5-flash)
enable_prefilter (boolean, optional) - Use motion-detection pre-filtering (default: true)

Example

result = await live_monitor(
    youtube_url="https://youtube.com/watch?v=LIVE_ID",
    condition="Is someone walking a dog?",
    webhook_url="https://webhook.site/your-id",
    interval_seconds=15
)
# Returns: "Job started successfully!\nJob ID: abc123...\n..."

Response Format

Job started successfully!
Job ID: [uuid]
Status: running
Monitoring: [youtube_url]
Condition: [condition]
Webhook: [webhook_url]

Jobs auto-stop after 10 minutes or when the condition is triggered.
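
Notifications are delivered as HTTP POSTs to the webhook_url you provide. The exact payload shape is an assumption here, so the receiver below is only a sketch that prints whatever body arrives, using just the Python standard library:

# Illustrative sketch: a minimal local webhook receiver for trigger notifications.
# It logs whatever VibeStream posts so you can inspect the actual payload.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            print("Webhook received:", json.loads(body))
        except json.JSONDecodeError:
            print("Webhook received (raw):", body.decode(errors="replace"))
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # Expose this port publicly (e.g., via a tunnel) and pass its URL as webhook_url.
    HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()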

live_digest

Start a background job that generates periodic narrative summaries of stream content.
The MCP tool uses webhooks to deliver summaries to agents. The REST API (/live-digest) uses Server-Sent Events (SSE) for browser clients.

Parameters

youtube_url (string, required) - YouTube Live URL to summarize
webhook_url (string, required) - Webhook URL for summaries
window_minutes (int, optional) - Summary window in minutes, 1-60 (default: 10)
capture_interval_seconds (int, optional) - Frame capture interval in seconds, 10-300 (default: 60)
model (string, optional) - VLM model (default: claude-3-5-sonnet)

Example

result = await live_digest(
    youtube_url="https://youtube.com/watch?v=LIVE_ID",
    webhook_url="https://webhook.site/your-id",
    window_minutes=5,
    capture_interval_seconds=30
)
# Returns: "Job started successfully!\nJob ID: xyz789...\n..."

Response Format

Job started successfully!
Job ID: [uuid]
Status: running
Summarizing: [youtube_url]
Window: [window_minutes] minutes
Webhook: [webhook_url]
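
The Job ID in the response is what you pass to get_job_status and cancel_job. Below is an illustrative sketch of starting a digest job through an MCP ClientSession (opened as in the Python SDK example later on) and extracting the Job ID, assuming the response layout shown above; start_digest is not part of the SDK:

# Illustrative sketch: start a digest job and return its Job ID,
# assuming the "Job ID: ..." line documented above.
async def start_digest(session, youtube_url: str, webhook_url: str) -> str:
    result = await session.call_tool(
        "live_digest",
        arguments={
            "youtube_url": youtube_url,
            "webhook_url": webhook_url,
            "window_minutes": 5,
            "capture_interval_seconds": 30,
        },
    )
    text = result.content[0].text
    for line in text.splitlines():
        if line.startswith("Job ID:"):
            return line.split(":", 1)[1].strip()
    raise RuntimeError(f"Could not find Job ID in response:\n{text}")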

get_job_status

Check the status and details of a running job.

Parameters

job_id (string, required) - Job ID returned by live_monitor or live_digest

Example

result = await get_job_status(job_id="abc123-...")
# Returns: "Job ID: abc123...\nType: live-monitor\nStatus: running\n..."

Response Format

Job ID: [uuid]
Type: [live-monitor or live-digest]
Status: [pending, running, stopped, failed]
URL: [youtube_url]
Created: [ISO timestamp]
Details: [job-specific details]
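
A common pattern is to poll get_job_status until the job leaves the pending/running states. The wait_for_job helper below is an illustrative sketch, assuming an open MCP ClientSession and the status values listed above:

# Illustrative sketch: poll get_job_status until the job stops or fails.
import asyncio

async def wait_for_job(session, job_id: str, poll_seconds: int = 30) -> str:
    while True:
        result = await session.call_tool("get_job_status", arguments={"job_id": job_id})
        text = result.content[0].text
        status = ""
        for line in text.splitlines():
            if line.startswith("Status:"):
                status = line.split(":", 1)[1].strip()
        if status not in ("pending", "running"):
            return text  # stopped or failed; return the full status text for inspection
        await asyncio.sleep(poll_seconds)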

cancel_job

Cancel a running monitoring or digest job.

Parameters

job_id (string, required) - Job ID of the job to cancel

Example

result = await cancel_job(job_id="abc123-...")
# Returns: "Job cancelled successfully!\nJob ID: abc123...\nFinal Status: stopped"

Response Format

Job cancelled successfully!
Job ID: [uuid]
Final Status: stopped
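
Because monitoring and digest jobs run in the background, it can be useful to guarantee cancellation even if your own code exits early. An illustrative sketch of that pattern, assuming an open MCP ClientSession:

# Illustrative sketch: always cancel the job, even if the surrounding logic fails.
async def run_with_cleanup(session, job_id: str):
    try:
        ...  # your own logic: receive webhooks, poll get_job_status, etc.
    finally:
        result = await session.call_tool("cancel_job", arguments={"job_id": job_id})
        print(result.content[0].text)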

Integration Examples

Claude Desktop

Add to your Claude Desktop MCP configuration (~/.config/claude/claude_desktop_config.json):
{
  "mcpServers": {
    "vibestream": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://vibestream-production-64f3.up.railway.app/mcp"
      ]
    }
  }
}
For local development, replace the URL with http://localhost:8000/mcp.

Gemini/Antigravity

Configure the MCP server in your agent’s settings, then ask:
“Check the weather stream at https://youtube.com/watch?v=LIVE_ID and tell me if it’s raining”
The agent will automatically use the check_once tool.

Python SDK

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    async with sse_client("https://vibestream-production-64f3.up.railway.app/mcp") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            
            # List available tools
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # Output: ['check_once', 'live_monitor', 'live_digest', 'get_job_status', 'cancel_job']
            
            # Call a tool
            result = await session.call_tool(
                "check_once",
                arguments={
                    "youtube_url": "https://youtube.com/watch?v=LIVE_ID",
                    "condition": "Are there people visible?"
                }
            )
            print(result.content[0].text)

asyncio.run(main())

Error Handling

All MCP tools return error messages as strings when something goes wrong:
Error: URL is not a live stream - Not a livestream
Error validating URL: [error message]
Error: Job not found - [job_id]
Error: [general error message]
Your agent should check for responses starting with “Error:” to handle failures gracefully.
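
A small helper that applies this convention, shown as an illustrative sketch:

# Illustrative sketch: raise on any tool response that follows the "Error:" convention.
def raise_on_error(text: str) -> str:
    if text.startswith("Error:"):
        raise RuntimeError(text)
    return text

# Example:
# text = raise_on_error(result.content[0].text)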