Try the Live Demo
The fastest way to try VibeStream is the hosted demo API; no installation is required.
Quick API Test
Test the API with a single curl command:
curl -X POST https://vibestream-production-64f3.up.railway.app/check-once \
-H "Content-Type: application/json" \
-d '{
"youtube_url": "https://www.youtube.com/watch?v=LIVE_STREAM_ID",
"condition": "Are there people visible?"
}'
Replace LIVE_STREAM_ID with an actual YouTube Live stream ID. Regular videos
won’t work.
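If you prefer to script the check, the same request can be made from Python. This is a sketch using the requests library; the endpoint and payload are exactly those from the curl example above.
# One-shot check against the hosted demo.
# Replace LIVE_STREAM_ID with a real YouTube Live stream ID before running.
import requests

resp = requests.post(
    "https://vibestream-production-64f3.up.railway.app/check-once",
    json={
        "youtube_url": "https://www.youtube.com/watch?v=LIVE_STREAM_ID",
        "condition": "Are there people visible?",
    },
    timeout=120,  # the check is synchronous, so allow time for frame capture and the VLM call
)
resp.raise_for_status()
print(resp.json())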
Self-Hosting
Installation
git clone https://github.com/raullenchai/vibestream
cd vibestream
pip install -r requirements-slim.txt
Configuration
Create a .env file:
# Required for live-monitor mode (Gemini is cheapest)
GOOGLE_API_KEY=your-gemini-api-key
# Required for live-digest mode
ANTHROPIC_API_KEY=your-anthropic-api-key
# Optional fallback
OPENAI_API_KEY=your-openai-api-key
You only need GOOGLE_API_KEY to get started. The other keys enable fallback
and summarization features.
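Before starting the server, it can help to confirm the keys are actually visible to the process. The snippet below is a sketch; it assumes your .env values are exported into the shell (or loaded with python-dotenv) and only reads the variable names listed above.
# Sanity check: confirm which API keys are present in the environment.
import os

if not os.getenv("GOOGLE_API_KEY"):
    raise SystemExit("GOOGLE_API_KEY is not set; live-monitor mode will not work.")
for optional in ("ANTHROPIC_API_KEY", "OPENAI_API_KEY"):
    if not os.getenv(optional):
        print(f"Note: {optional} is not set; the corresponding feature is disabled.")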
Start the Server
python -m uvicorn api.server:app --host 0.0.0.0 --port 8000
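To verify the server is accepting connections before you submit jobs, any HTTP response (even a 404) is enough; a connection error means it is not up yet. A small sketch:
# Poll until the local server accepts connections; any HTTP status counts as "up".
import time
import requests

for _ in range(30):
    try:
        requests.get("http://localhost:8000/", timeout=2)
        print("Server is up.")
        break
    except requests.ConnectionError:
        time.sleep(1)
else:
    raise SystemExit("Server did not come up within 30 seconds.")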
Your First Live Monitor Job
Important constraints:
- The URL must be a live stream (not a regular video or VOD)
- The condition must be a yes/no question
- Jobs auto-stop after 10 minutes
curl -X POST http://localhost:8000/live-monitor \
-H "Content-Type: application/json" \
-d '{
"youtube_url": "https://youtube.com/watch?v=LIVE_ID",
"condition": "Are there people walking?",
"webhook_url": "https://webhook.site/your-id"
}'
Response:
{
"job_id": "abc123...",
"status": "running",
"job_type": "live-monitor"
}
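The webhook_url receives an HTTP POST whenever the condition triggers. The exact payload shape is not documented here, so the receiver below simply logs whatever JSON arrives; it is a sketch that assumes FastAPI is installed (the VibeStream server itself is an ASGI app run under uvicorn).
# Minimal webhook receiver for live-monitor triggers (hypothetical example).
# Run it somewhere publicly reachable and pass its URL as webhook_url.
from fastapi import FastAPI, Request

app = FastAPI()

@app.post("/vibestream-webhook")
async def vibestream_webhook(request: Request):
    payload = await request.json()  # log whatever VibeStream posts
    print("Trigger received:", payload)
    return {"ok": True}

# Start with: python -m uvicorn webhook_receiver:app --port 9000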
Check Job Status
curl http://localhost:8000/jobs/YOUR_JOB_ID
{
"job_id": "abc123",
"status": "running",
"details": {
"checks_performed": 15,
"triggers_fired": 2,
"frames_skipped": 42
}
}
frames_skipped shows how many frames the pre-filter saved you from sending
to the VLM API.
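To follow a job from a script rather than by hand, you can poll the status endpoint until the job leaves the running state. This is a sketch; the field names match the response shown above, and jobs auto-stop after 10 minutes, so the loop is bounded.
# Poll a job until it is no longer running.
import time
import requests

BASE = "http://localhost:8000"
job_id = "YOUR_JOB_ID"  # taken from the live-monitor response

while True:
    job = requests.get(f"{BASE}/jobs/{job_id}", timeout=10).json()
    details = job.get("details", {})
    print(job["status"],
          "checks:", details.get("checks_performed"),
          "triggers:", details.get("triggers_fired"))
    if job["status"] != "running":
        break
    time.sleep(30)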
One-Shot Check
Don’t need continuous monitoring? Use /check-once for a single synchronous check:
curl -X POST http://localhost:8000/check-once \
-H "Content-Type: application/json" \
-d '{
"youtube_url": "https://youtube.com/watch?v=LIVE_ID",
"condition": "Is it raining?"
}'
{
"triggered": false,
"explanation": "No rain is visible. The sky appears clear and streets are dry.",
"model": "gemini-2.0-flash"
}
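Because /check-once is synchronous, the response can drive a script directly, for example by branching on triggered. A sketch using only the fields shown in the response above:
# Run a single check and act on the result.
import requests

result = requests.post(
    "http://localhost:8000/check-once",
    json={
        "youtube_url": "https://youtube.com/watch?v=LIVE_ID",
        "condition": "Is it raining?",
    },
    timeout=120,
).json()

if result["triggered"]:
    print("Condition met:", result["explanation"])
else:
    print("Not triggered:", result["explanation"])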
Deploy to Railway
Deploy your own instance with the Railway CLI, setting your API keys as environment variables:
railway init
railway variables set GOOGLE_API_KEY=your-key
railway variables set ANTHROPIC_API_KEY=your-key
railway up
MCP for AI Agents
VibeStream exposes its functionality as MCP tools, allowing AI agents to interact with live streams.
Claude Desktop
Add to ~/.config/claude/claude_desktop_config.json:
{
"mcpServers": {
"vibestream": {
"command": "npx",
"args": [
"mcp-remote",
"https://vibestream-production-64f3.up.railway.app/mcp"
]
}
}
}
For local development, use http://localhost:8000/mcp.
The server exposes the following MCP tools:
- check_once - One-shot condition check
- live_monitor - Start continuous monitoring
- live_digest - Generate periodic summaries
- get_job_status - Check job status
- cancel_job - Cancel a running job
Next Steps