Personal Pipeline

An intelligent Model Context Protocol (MCP) server that transforms scattered operational knowledge into structured, actionable intelligence for automated incident response.

What is Personal Pipeline?

Personal Pipeline is specifically designed to support LangGraph agents handling monitoring alerts by providing context-aware retrieval of runbooks, decision trees, and operational procedures. It transforms your operational documentation into an intelligent system that can respond to incidents automatically.

Under the hood, it is a fully operational TypeScript/Node.js MCP server with enterprise-grade features.
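
For example, a LangGraph agent (or any MCP-compatible client) can ask the server for the runbook that matches an incoming alert. The sketch below assumes the standard MCP TypeScript SDK client API and an assumed build entry point; the tool name search_runbooks comes from this project, but the argument names (alert_type, severity) are illustrative guesses, not the confirmed schema.

typescript
// Minimal sketch: an agent-side MCP client asking Personal Pipeline for the
// runbook that matches a monitoring alert. Entry point and tool arguments
// are assumptions for illustration.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"], // assumed server entry point after npm run build
});

const client = new Client(
  { name: "incident-agent", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// search_runbooks is one of the 7 MCP tools; argument names are hypothetical.
const result = await client.callTool({
  name: "search_runbooks",
  arguments: { alert_type: "disk_space_critical", severity: "high" },
});
console.log(result.content);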

🚀 Quick Start

Choose your installation method:

bash
# Method 1: From source (recommended for development)
git clone https://github.com/dpark2025/personal-pipeline.git
cd personal-pipeline
npm install && npm run build

# Method 2: Demo environment (fastest way to try it)
npm run demo:start

# Method 3: Docker (coming soon)
# docker run -p 3000:3000 personal-pipeline/mcp-server

🎯 Key Features

Dual Access Patterns

  • MCP Protocol: Native integration with LangGraph agents and MCP-compatible clients
  • REST API: 11 HTTP endpoints for external integrations and web UIs
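
The same retrieval is also reachable over plain HTTP for systems that do not speak MCP. The sketch below only illustrates the REST access pattern: the /api/search path, port, and request body are assumptions, and the real routes are documented in the REST API reference.

typescript
// Minimal sketch of the REST access pattern. Endpoint path, port, and body
// fields are assumptions; consult the REST API reference for actual routes.
const response = await fetch("http://localhost:3000/api/search", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query: "disk space critical", max_results: 3 }),
});

if (!response.ok) {
  throw new Error(`Search failed with status ${response.status}`);
}
const results = await response.json();
console.log(results);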

Intelligence Layer

  • 7 MCP Tools for context-aware documentation retrieval
  • Sub-150ms response times for critical runbook retrieval
  • Confidence scoring for all recommendations
  • Decision trees for progressive incident resolution
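
Conceptually, each retrieval result carries a confidence score and can point into a decision tree for progressive resolution. The types below are a hypothetical sketch of that shape, intended to show how an agent might rank and act on recommendations; they are not the server's published schema.

typescript
// Hypothetical shape of a runbook match with confidence scoring and a
// decision tree. Field names are illustrative, not the published schema.
interface DecisionNode {
  id: string;
  question: string;            // e.g. "Is disk usage above 90%?"
  yes?: DecisionNode | string; // next node, or a procedure id to run
  no?: DecisionNode | string;
}

interface RunbookMatch {
  runbookId: string;
  title: string;
  confidence: number;          // 0.0 - 1.0, used to rank recommendations
  procedures: string[];        // ordered operational steps
  decisionTree?: DecisionNode; // optional progressive-resolution flow
}

// An agent would typically act only on matches above a confidence threshold.
function selectRunbook(
  matches: RunbookMatch[],
  threshold = 0.7
): RunbookMatch | undefined {
  return matches
    .filter((m) => m.confidence >= threshold)
    .sort((a, b) => b.confidence - a.confidence)[0];
}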

Enterprise Performance

  • 99.9% uptime with circuit breaker resilience
  • 75% cache hit rate with hybrid Redis + memory caching
  • 50+ concurrent operations supported
  • Performance monitoring with real-time dashboards
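
The hybrid caching behind these numbers boils down to a two-tier lookup: check an in-process memory cache first, fall back to Redis, then to the source adapter on a miss. The sketch below shows that pattern using the ioredis client; key naming, TTLs, and the client choice are assumptions, not the project's actual implementation.

typescript
// Rough sketch of a hybrid (memory + Redis) cache lookup. Key names, TTLs,
// and the ioredis client are assumptions for illustration.
import Redis from "ioredis";

const memory = new Map<string, string>();
const redis = new Redis(); // defaults to localhost:6379

async function getCached(
  key: string,
  loadFromSource: () => Promise<string>
): Promise<string> {
  // Tier 1: in-process memory (fastest, local to this process)
  const hit = memory.get(key);
  if (hit !== undefined) return hit;

  // Tier 2: Redis (shared across instances, survives restarts)
  const cached = await redis.get(key);
  if (cached !== null) {
    memory.set(key, cached);
    return cached;
  }

  // Miss: load from the source adapter and populate both tiers
  const value = await loadFromSource();
  memory.set(key, value);
  await redis.set(key, value, "EX", 3600); // assumed 1-hour TTL
  return value;
}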

📖 Documentation

Getting Started

API Reference

  • MCP Tools - 7 intelligent tools for documentation retrieval
  • REST API - 11 HTTP endpoints for integration

Examples & Guides

🏗️ Architecture

┌─────────────────────────────────────────────────────────────────┐
│                        Personal Pipeline                        │
│                    MCP Server Architecture                      │
└─────────────────────────────────────────────────────────────────┘

ACCESS PATTERNS:
  LangGraph Agent  ──────────►  MCP Protocol  ─┐
  External Systems ──────────►  REST API      ─┼──► Core Engine
  Demo Scripts     ──────────►  HTTP Client   ─┘

CORE ENGINE:
  ├── 7 MCP Tools (search_runbooks, get_decision_tree, etc.)
  ├── 11 REST Endpoints (search, health, performance, etc.)
  ├── Source Adapter Registry
  ├── Hybrid Caching Layer (Redis + Memory)
  └── Performance Monitoring

SOURCE ADAPTERS:
  ├── FileSystem Adapter     ──────► Local Files & Directories
  ├── Confluence Adapter     ──────► Confluence Spaces (Planned)
  ├── GitHub Adapter         ──────► GitHub Repositories (Planned)
  └── Database Adapter       ──────► PostgreSQL/MongoDB (Planned)

INFRASTRUCTURE:
  ├── Redis Cache            ──────► 60-80% MTTR reduction
  ├── Circuit Breakers       ──────► 99.9% uptime
  └── Health Monitoring      ──────► Real-time metrics
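
New documentation sources plug into the core engine through the source adapter registry shown above. The interface below is a hypothetical sketch of what such an adapter contract might look like; the repository's actual adapter API may differ.

typescript
// Hypothetical sketch of a source adapter contract registered with the core
// engine. Method and field names are illustrative, not the actual API.
interface SearchResult {
  id: string;
  title: string;
  content: string;
  confidence: number;
}

interface SourceAdapter {
  readonly name: string;                          // e.g. "filesystem", "confluence"
  initialize(): Promise<void>;                    // connect, index, warm caches
  search(query: string): Promise<SearchResult[]>; // context-aware retrieval
  healthCheck(): Promise<boolean>;                // feeds health monitoring
}

// A filesystem-backed adapter would walk local directories and index runbook
// files; Confluence, GitHub, and database adapters would follow the same contract.
class FileSystemAdapter implements SourceAdapter {
  readonly name = "filesystem";
  constructor(private rootDir: string) {}
  async initialize(): Promise<void> {
    /* scan rootDir and build an index (omitted in this sketch) */
  }
  async search(query: string): Promise<SearchResult[]> {
    return []; // placeholder: return indexed documents matching the query
  }
  async healthCheck(): Promise<boolean> {
    return true;
  }
}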

🛠️ Development

bash
# Development environment
npm run dev

# Performance testing  
npm run benchmark

# Enhanced MCP explorer
npm run mcp-explorer

# Health monitoring
npm run health:dashboard

🚀 Performance Metrics

  • Sub-2ms response time for critical runbook retrieval
  • 7/7 performance targets met in benchmark testing
  • 11 REST API endpoints with dual MCP/REST access
  • 99.9% uptime with circuit breaker resilience

Ready to get started? Check out our Installation Guide or try the Quick Start.
