Awesome MCP Servers for Developer Tools

8797 MCP Servers Found

Greb MCP

Vaibhav Raina

GREB MCP Server — semantic code search for AI agents without indexing your codebase or storing any data. Fast and accurate. Available on npm (cheetah-greb) and PyPI (cheetah-greb).

FEATURES
- Natural Language Search: describe what you're looking for in plain English
- High-Precision Results: smart ranking returns the most relevant code first
- Works with Any MCP Client: Claude Desktop, Cursor, Windsurf, Cline, Kiro, and more
- No Indexing Required: search any codebase instantly without setup
- Fast: results in under 5 seconds, even for large repositories

INSTALLATION
Install Greb globally using pip or npm.
Python: pip install cheetah-greb
Node.js: npm install -g cheetah-greb

GET YOUR API KEY
1. Go to Dashboard > API Keys at https://grebmcp.com/dashboard/api-keys
2. Click "Create API Key"
3. Copy the key (starts with grb_)

CONFIGURATION
Add to your MCP client config (Cursor, Windsurf, Claude Desktop, Kiro, etc.):

Python installation:

  {
    "mcpServers": {
      "greb-mcp": {
        "command": "greb-mcp",
        "env": { "GREB_API_KEY": "grb_your_api_key_here" }
      }
    }
  }

Node.js installation:

  {
    "mcpServers": {
      "greb-mcp": {
        "command": "greb-mcp-js",
        "env": { "GREB_API_KEY": "grb_your_api_key_here" }
      }
    }
  }

CLAUDE CODE SETUP
Mac/Linux (Python):
  claude mcp add --transport stdio greb-mcp --env GREB_API_KEY=grb_your_api_key_here -- greb-mcp
Windows PowerShell (Python):
  claude mcp add greb-mcp greb-mcp --transport stdio --env "GREB_API_KEY=grb_your_api_key_here"
Mac/Linux (Node.js):
  claude mcp add --transport stdio greb-mcp --env GREB_API_KEY=grb_your_api_key_here -- greb-mcp-js
Windows PowerShell (Node.js):
  claude mcp add greb-mcp greb-mcp-js --transport stdio --env "GREB_API_KEY=grb_your_api_key_here"

TOOL: code_search
Search code using natural language queries powered by AI.
Parameters:
- query (string, required): natural language search query
- keywords (object, required): search configuration
  - keywords.primary_terms (string array, required): high-level semantic terms (e.g., "authentication", "database")
  - keywords.code_patterns (string array, optional): literal code patterns to grep for
  - keywords.file_patterns (string array, required): file extensions to search (e.g., ["*.ts", "*.js"])
  - keywords.intent (string, required): brief description of what you're looking for
- directory (string, required): full absolute path to the directory to search

Example:

  {
    "query": "find authentication middleware",
    "keywords": {
      "primary_terms": ["authentication", "middleware", "jwt"],
      "code_patterns": ["authenticate(", "isAuthenticated"],
      "file_patterns": ["*.js", "*.ts"],
      "intent": "find auth middleware implementation"
    },
    "directory": "/Users/dev/my-project"
  }

Response includes:
- File paths
- Line numbers
- Relevance scores
- Code content
- Reasoning for each match

USAGE EXAMPLES
Ask your AI assistant to search code naturally:
"Use greb mcp to find authentication middleware"
"Use greb mcp to find all API endpoints"
"Use greb mcp to look for database connection setup"
"Use greb mcp to find where user validation happens"
"Use greb mcp to search for error handling patterns"

LINKS
Website: https://grebmcp.com
Documentation: https://grebmcp.com/docs
Get API Key: https://grebmcp.com/dashboard/api-keys
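As a quick illustration of the parameter schema above, here is a small hypothetical helper that assembles and sanity-checks a code_search arguments dict before it is sent to the server. The helper name and its validation rules are my own, not part of Greb.

```python
# Illustrative only: assemble and sanity-check a code_search request
# matching the documented parameter schema. Not Greb's actual code.
REQUIRED_KEYWORD_FIELDS = ("primary_terms", "file_patterns", "intent")

def build_code_search_request(query: str, keywords: dict, directory: str) -> dict:
    """Return a code_search arguments dict, raising on missing required fields."""
    for field in REQUIRED_KEYWORD_FIELDS:
        if field not in keywords:
            raise ValueError(f"keywords.{field} is required")
    # The directory must be a full absolute path (Unix "/..." or Windows "C:\...").
    if not directory.startswith("/") and ":" not in directory:
        raise ValueError("directory must be a full absolute path")
    return {"query": query, "keywords": keywords, "directory": directory}

request = build_code_search_request(
    query="find authentication middleware",
    keywords={
        "primary_terms": ["authentication", "middleware", "jwt"],
        "code_patterns": ["authenticate(", "isAuthenticated"],  # optional field
        "file_patterns": ["*.js", "*.ts"],
        "intent": "find auth middleware implementation",
    },
    directory="/Users/dev/my-project",
)
```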

★★★★★

2 months ago

CodeConductor IDE Orchestrator

CodeConductor Team

CodeConductor IDE Orchestrator gives Claude Desktop full access to your development environment through a secure MCP server. It connects Claude to your VS Code workspace, enabling real file operations, code navigation, Git workflows, and automated development tasks. The server exposes 24 IDE tools, including:
- File operations — open, read, write, list, search
- Code intelligence — go to definition, find references, diagnostics
- Git automation — status, diff, stage, commit, push (Pro)
- Command execution — run terminal commands safely (Pro)

Command Execution Safety: Pro users can run commands through Claude, but everything is protected by a three-tier safety system. Safe commands run automatically, while higher-risk commands require explicit confirmation and are screened for dangerous patterns. This prevents accidental damage and blocks command injection attempts.

All operations run locally on your machine, never transmit code, and follow a tier-based permission system (Free vs Pro). Requires the CodeConductor VS Code extension to function. This orchestrator turns Claude into a capable, controlled development assistant that operates directly inside your real project.
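The three-tier safety system described above can be sketched as a simple classifier. The tiers and patterns below are illustrative examples of the approach, not CodeConductor's actual rules:

```python
# Toy three-tier command safety policy: safe commands run automatically,
# unrecognized ones need explicit confirmation, dangerous patterns are blocked.
import re

SAFE_COMMANDS = {"git status", "ls", "pwd", "npm test"}  # example allowlist
DANGEROUS_PATTERNS = [
    re.compile(r"rm\s+-rf\s+/"),      # recursive delete from root
    re.compile(r"[;&|]\s*rm\b"),      # chained delete (injection attempt)
    re.compile(r"curl\s+.*\|\s*sh"),  # pipe-to-shell
]

def classify(command: str) -> str:
    """Return 'allow', 'confirm', or 'block' for a proposed shell command."""
    if any(p.search(command) for p in DANGEROUS_PATTERNS):
        return "block"
    if command.strip() in SAFE_COMMANDS:
        return "allow"
    return "confirm"  # anything unrecognized requires user approval
```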

★★★★★

2 months ago

Prompt Optimizer Local

Prompt Optimizer

Transform AI prompts with intelligent context detection, 50+ professional optimization techniques, and seamless team collaboration. Integrated with Claude Desktop, Cursor, Windsurf, and 17+ MCP clients.

• **100% Local Processing** - All prompt optimization is done on your machine, ensuring complete privacy and confidentiality.
• **Offline Capability** - Works without an internet connection, making it ideal for secure or air-gapped environments.
• **Advanced Local Prompt Intelligence** - Sophisticated content analysis and optimization performed directly on your machine.
• **Cross-Platform Support** - Universal compatibility for Windows, macOS, and Linux.
• **Binary Integrity Verification** - SHA256 hash validation ensures the integrity of the local server.
• **Technical Parameter Preservation** - Maintains code blocks, API calls, and other technical details during optimization, including parameters like --ar and --v.
• **Debugging Scenario Detection** - Context-aware optimization tailored for debugging and technical prompts.

★★★★★

2 months ago

Prompt Optimizer

nivlewd1

Our system automatically analyzes your prompts to detect whether they're for image generation (like Midjourney), LLM interaction, or technical automation, then applies the most effective optimization techniques for that context.

• **100% Local Processing** - All prompt optimization is done on your machine, ensuring complete privacy and confidentiality.
• **Offline Capability** - Works without an internet connection, making it ideal for secure or air-gapped environments.
• **Advanced Local Prompt Intelligence** - Sophisticated content analysis and optimization performed directly on your machine.
• **Cross-Platform Support** - Universal compatibility for Windows, macOS, and Linux.
• **Binary Integrity Verification** - SHA256 hash validation ensures the integrity of the local server.
• **Technical Parameter Preservation** - Maintains code blocks, API calls, and other technical details during optimization, including parameters like --ar and --v.
• **Debugging Scenario Detection** - Context-aware optimization tailored for debugging and technical prompts.
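The kind of context detection described above can be sketched with simple surface cues. The cue lists here are my own illustrative assumptions, not the optimizer's real heuristics:

```python
# Toy prompt-context detector: image generation vs. technical/debugging
# vs. general LLM interaction, based on surface cues.
import re

IMAGE_CUES = re.compile(r"--ar\b|--v\b|midjourney|photorealistic", re.I)
TECH_CUES = re.compile(r"stack trace|traceback|debug|exception|regex|api", re.I)

def detect_context(prompt: str) -> str:
    if IMAGE_CUES.search(prompt):
        return "image_generation"   # e.g. Midjourney-style parameters
    if TECH_CUES.search(prompt):
        return "technical"          # debugging / automation scenarios
    return "llm_interaction"        # everything else
```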

★★★★★

2 months ago

Sonatype Dependency Management Mcp Server

sonatype

The Sonatype MCP Server enables AI assistants to access Sonatype's comprehensive dependency intelligence directly within your development workflow. By integrating with the Model Context Protocol, your AI assistant can help you make informed decisions about dependencies, identify security risks, and maintain compliance — all without leaving your IDE.

Key Features
- Component Version Selection - Select the best version the first time, without the side quest
- Security Vulnerability Scanning - Identify known vulnerabilities in your project dependencies
- License Compliance Checking - Ensure your dependencies meet your organization's license policies
- Dependency Health Analysis - Get insights into dependency quality, maintenance status, and risk factors
- Real-time Security Advisories - Stay informed about the latest security threats affecting your dependencies
- Remediation Guidance - Receive actionable recommendations to fix vulnerabilities and compliance issues

★★★★★

3 months ago

Smart AI Bridge

Platano78

Smart AI Bridge is a production-ready Model Context Protocol (MCP) server that orchestrates AI-powered development operations across multiple backends with automatic failover, smart routing, and advanced error prevention capabilities.

Key Features

🤖 Multi-AI Backend Orchestration
- Pre-configured 4-backend system: 1 local model + 3 cloud AI backends (fully customizable - bring your own providers)
- Fully expandable: add unlimited backends via the EXTENDING.md guide
- Intelligent routing: automatic backend selection based on task complexity and content analysis
- Health-aware failover: circuit breakers with automatic fallback chains
- Bring your own models: configure any AI provider (local models, cloud APIs, custom endpoints)

🎨 Bring Your Own Backends: the system ships with an example configuration using local LM Studio and NVIDIA cloud APIs, but supports ANY AI providers - OpenAI, Anthropic, Azure OpenAI, AWS Bedrock, custom APIs, or local models via Ollama/vLLM/etc. See EXTENDING.md for the integration guide.
🎯 Advanced Fuzzy Matching
- Three-phase matching: exact (<5ms) → fuzzy (<50ms) → suggestions (<100ms)
- Error prevention: 80% reduction in "text not found" errors
- Levenshtein distance: industry-standard similarity calculation
- Security hardened: 9.7/10 security score with DoS protection
- Cross-platform: automatic Windows/Unix line-ending handling

🛠️ Comprehensive Toolset
- 19 total tools: 9 core tools + 10 intelligent aliases
- Code review: AI-powered analysis with security auditing
- File operations: advanced read, edit, write with atomic transactions
- Multi-edit: batch operations with automatic rollback
- Validation: pre-flight checks with fuzzy matching support

🔒 Enterprise Security
- Security score: 9.7/10 with comprehensive controls
- DoS protection: complexity limits, iteration caps, timeout enforcement
- Input validation: type checking, structure validation, sanitization
- Metrics tracking: operation monitoring and abuse detection
- Audit trail: complete logging with error sanitization

🏆 Production Ready: 100% test coverage, enterprise-grade reliability, MIT licensed

🚀 Multi-Backend Architecture
Flexible 4-backend system pre-configured with 1 local + 3 cloud backends for maximum development efficiency. The architecture is fully expandable - see EXTENDING.md for adding additional backends.

🎯 Pre-configured AI Backends
The system comes with 4 specialized backends (fully expandable via EXTENDING.md):

Cloud Backend 1 - Coding Specialist (Priority 1)
- Specialization: advanced coding, debugging, implementation
- Optimal for: JavaScript, Python, API development, refactoring, game development
- Routing: automatic for coding patterns and task_type: 'coding'
- Example providers: OpenAI GPT-4, Anthropic Claude, Qwen via NVIDIA API, Codestral, etc.
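The three-phase matching above (exact → fuzzy → suggestions) can be sketched with the standard Levenshtein distance. This is a minimal illustration, not Smart AI Bridge's implementation:

```python
# Minimal three-phase text matcher: exact hit, then Levenshtein-based fuzzy
# match, then ranked suggestions when nothing is close enough.
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def find_text(needle: str, candidates: list[str], max_distance: int = 2):
    """Phase 1: exact. Phase 2: fuzzy within max_distance. Phase 3: suggestions."""
    if needle in candidates:
        return ("exact", needle)
    ranked = sorted(candidates, key=lambda c: levenshtein(needle, c))
    if ranked and levenshtein(needle, ranked[0]) <= max_distance:
        return ("fuzzy", ranked[0])
    return ("suggestions", ranked[:3])  # nearest alternatives for the caller
```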
Cloud Backend 2 - Analysis Specialist (Priority 2)
- Specialization: mathematical analysis, research, strategy
- Features: advanced reasoning capabilities with thinking process
- Optimal for: game balance, statistical analysis, strategic planning
- Routing: automatic for analysis patterns and math/research tasks
- Example providers: DeepSeek via NVIDIA/custom API, Claude Opus, GPT-4 Advanced, etc.

Local Backend - Unlimited Tokens (Priority 3)
- Specialization: large context processing, unlimited capacity
- Optimal for: processing large files (>50KB), extensive documentation, massive codebases
- Routing: automatic for large prompts and unlimited token requirements
- Example providers: any local model via LM Studio, Ollama, vLLM - DeepSeek, Llama, Mistral, Qwen, etc.

Cloud Backend 3 - General Purpose (Priority 4)
- Specialization: general-purpose tasks, additional fallback capacity
- Optimal for: diverse tasks, backup routing, multi-modal capabilities
- Routing: fallback and general-purpose queries
- Example providers: Google Gemini, Azure OpenAI, AWS Bedrock, Anthropic Claude, etc.

🎨 Example Configuration: the default setup uses LM Studio (local) + NVIDIA API (cloud), but you can configure ANY providers. See EXTENDING.md for step-by-step instructions on integrating OpenAI, Anthropic, Azure, AWS, or custom APIs.

🧠 Smart Routing Intelligence
Advanced content analysis with empirical learning:

// Smart Routing Decision Tree
if (prompt.length > 50000)                 → Local Backend (unlimited capacity)
else if (math/analysis patterns detected)  → Cloud Backend 2 (analysis specialist)
else if (coding patterns detected)         → Cloud Backend 1 (coding specialist)
else                                       → Cloud Backend 1 (highest priority default)

Pattern Recognition:
- Coding patterns: function|class|debug|implement|javascript|python|api|optimize
- Math/analysis patterns: analyze|calculate|statistics|balance|metrics|research|strategy
- Large context: file size >100KB or prompt length >50,000 characters
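The routing decision tree above can be rendered as a few lines of Python. The pattern lists come straight from the description; the function itself is an illustration, not the server's actual router:

```python
# Smart-routing sketch: large prompts go local, analysis patterns go to the
# analysis specialist, coding patterns (and everything else) to backend 1.
import re

CODING = re.compile(r"function|class|debug|implement|javascript|python|api|optimize", re.I)
ANALYSIS = re.compile(r"analyze|calculate|statistics|balance|metrics|research|strategy", re.I)

def route(prompt: str) -> str:
    if len(prompt) > 50_000:
        return "local"     # unlimited-capacity local backend
    if ANALYSIS.search(prompt):
        return "cloud-2"   # analysis specialist
    if CODING.search(prompt):
        return "cloud-1"   # coding specialist
    return "cloud-1"       # highest-priority default
```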

★★★★★

4 months ago

Icons8 MCP Server

Icons8

Icons8 MCP Server gives AI coding environments instant access to 368,865+ Icons8 icons across 116 design styles. Connect your Claude, Cursor, Windsurf, VS Code, or any SSE-capable MCP host to stream high-res PNGs for free or unlock full SVG delivery with an API key — ideal for prototyping, production code, and rapid UI experiments.

Use cases
- Replace entire icon sets in existing projects with consistent, on-trend styles.
- Add icons to bullet lists, feature highlights, process steps, and dashboards without leaving your IDE.
- Prototype quickly with free PNG previews, then upgrade to SVGs for production-ready assets.
- Build AI-assisted workflows that search, preview, and drop icons directly into your code.
- Showcase interactivity with ready-made demos such as falling emojis, sci-fi dashboards, and product catalogs.

★★★★★

4 months ago

MCP Datadog Server

ClaudioLazaro

MCP Datadog Server — a complete, robust MCP (Model Context Protocol) server for integration with the Datadog APIs. A production MCP server offering 351 tools for interacting with all Datadog APIs through LLMs, including full CRUD operations, curated tools, and tools generated automatically from the schema.

🚀 Main Features

📊 Available Tools (351 total)
- 9 curated tools 🎯 - handcrafted, optimized for specific use cases
- 25 CRUD tools ⚡ - CREATE, READ, UPDATE, and DELETE operations for core resources
- 319 generated tools 🔧 - generated automatically from the official Datadog schema

🔍 Advanced Capabilities
✅ Schema autodiscovery - LLMs discover parameters automatically
✅ Robust validation - Zod schemas with full validation
✅ Progress tracking - real-time progress for long-running operations
✅ Error handling - intelligent error handling and automatic retry
✅ Rich CLI - complete interface for management and debugging

🛡️ MCP Compliance
✅ 100% compatible with the official TypeScript SDK
✅ Complete JSON Schema for every tool
✅ Detailed metadata annotations
✅ Type safety with Zod validation
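The "automatic retry" behavior called out above follows a common retry-with-backoff pattern, sketched below. The delays and retryable status codes are illustrative examples, not taken from this server's code:

```python
# Generic retry helper: retry on retryable HTTP statuses with exponential
# backoff, returning the final (status, body) pair.
import time

def call_with_retry(fn, retries=3, base_delay=0.5, retryable=(429, 500, 502, 503)):
    """fn() must return (status, body); retries up to `retries` times."""
    for attempt in range(retries + 1):
        status, body = fn()
        if status not in retryable or attempt == retries:
            return status, body
        time.sleep(base_delay * 2 ** attempt)  # 0.5s, 1s, 2s, ...
```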

★★★★★

4 months ago

CodeGraph MCP

Jakedismo

# Transform any MCP-compatible LLM into a codebase expert through semantic intelligence

A blazingly fast GraphRAG implementation, 100% Rust, for indexing and querying large codebases with natural language. Supports multiple embedding modes: cpu (no graph, just AST parsing), onnx (blazingly fast, medium-quality embeddings with Qdrant/all-MiniLM-L6-v2-onnx), and Ollama (time-consuming SOTA embeddings with hf.co/nomic-ai/nomic-embed-code-GGUF:Q4_K_M). I would argue this is the fastest codebase indexer on GitHub at the moment. Includes a Rust-SDK stdio MCP server so that your agents can query the indexed code graph with natural language and get deep insights from your codebase before starting development or making changes. Currently supports TypeScript, JavaScript, Rust, Go, Python, and C++ codebases.

📊 Performance Benchmarking (M4 Max 128GB)
Production codebase results (1,505 files, 2.5M lines; Python, JavaScript, TypeScript, and Go):

🎉 INDEXING COMPLETE! 📊 Performance Summary
- 📄 Files: 1,505 indexed
- 📝 Lines: 2,477,824 processed
- 🔧 Functions: 30,669 extracted
- 🏗️ Classes: 880 extracted
- 💾 Embeddings: 538,972 generated

Embedding Provider Performance Comparison
- 🧠 Ollama nomic-embed-code — ~15-18h — SOTA retrieval accuracy — production, smaller codebases
- ⚡ ONNX all-MiniLM-L6-v2 — 32m 22s — good general embeddings — large codebases, lunch-break indexing
- 📚 LEANN — ~4h — the next best thing I could find on GitHub

CodeGraph Advantages
✅ Incremental updates: only reprocess changed files (LEANN can't do this)
✅ Provider choice: speed vs. quality optimization based on needs
✅ Memory optimization: automatic optimizations based on your system
✅ Production ready: index 2.5M lines while having lunch

Read the README.md carefully: the installation is complex and requires downloading the embedding model in ONNX format, installing Ollama, and setting up multiple environment variables (I would recommend setting these in your shell configuration).
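The incremental-update advantage above boils down to change detection: hash each file and reindex only those whose content changed since the last run. A minimal sketch, purely illustrative (CodeGraph's real change detection may differ):

```python
# Incremental indexing sketch: compare per-file content digests against the
# previous run and return only the files that need reprocessing.
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def changed_files(root: Path, previous: dict[str, str]):
    """Return (files to reindex, updated digest map) for all .py files under root."""
    current, to_reindex = {}, []
    for path in sorted(root.rglob("*.py")):
        digest = file_digest(path)
        current[str(path)] = digest
        if previous.get(str(path)) != digest:
            to_reindex.append(path)   # new or modified since last run
    return to_reindex, current
```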

★★★★★

4 months ago

Codegraph Rust

Jakedismo

🎯 Overview
CodeGraph is a powerful CLI tool that combines MCP (Model Context Protocol) server management with sophisticated code analysis capabilities. It provides a unified interface for indexing projects, managing embeddings, and running MCP servers with multiple transport options. All you need now is an agent (or several) to create your very own deep code and project knowledge synthesizer system!

Key Capabilities
🔍 Advanced Code Analysis: parse and analyze code across multiple languages using Tree-sitter
🚄 Dual Transport Support: run MCP servers with STDIO, HTTP, or both simultaneously
🎯 Vector Search: semantic code search using FAISS-powered vector embeddings
📊 Graph-Based Architecture: navigate code relationships with RocksDB-backed graph storage
⚡ High Performance: optimized for large codebases with parallel processing and batched embeddings
🔧 Flexible Configuration: extensive configuration options for embedding models and performance tuning

RAW PERFORMANCE ✨✨✨
170K lines of Rust code in 0.49s! 21,024 embeddings in 3:24min! On an M3 Pro 32GB, Qdrant/all-MiniLM-L6-v2-onnx on CPU, no Metal acceleration used!
Parsing completed: 353/353 files, 169,397 lines in 0.49s (714.5 files/s, 342,852 lines/s)
[00:03:24] [########################################] 21024/21024 Embeddings complete

✨ Features

Project Indexing
- Multi-language support (Rust, Python, JavaScript, TypeScript, Go, Java, C++)
- Incremental indexing with file watching
- Parallel processing with configurable workers
- Smart caching for improved performance

MCP Server Management
- STDIO transport for direct communication
- HTTP streaming with SSE support
- Dual transport mode for maximum flexibility
- Background daemon mode with PID management

Code Search
- Semantic search using embeddings
- Exact match and fuzzy search
- Regex and AST-based queries
- Configurable similarity thresholds

Architecture Analysis
- Component relationship mapping
- Dependency analysis
- Code pattern detection
- Architecture visualization support
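"Semantic search with configurable similarity thresholds" reduces to nearest-neighbor lookup over embedding vectors. A pocket-sized sketch with plain cosine similarity over toy vectors (real setups use FAISS and learned embeddings):

```python
# Threshold-filtered vector search: score every indexed snippet against the
# query vector, drop low-similarity hits, return the top_k best.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def search(query_vec, index, threshold=0.75, top_k=5):
    """index: list of (snippet_id, vector) pairs; keep hits above threshold."""
    hits = [(sid, cosine(query_vec, vec)) for sid, vec in index]
    hits = [h for h in hits if h[1] >= threshold]
    return sorted(hits, key=lambda h: -h[1])[:top_k]
```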

★★★★★

4 months ago

Scalekit

Scalekit Model Context Protocol (MCP) server provides comprehensive tools for managing environments, organizations, users, connections, and workspace operations. Built for developers who want to connect their AI tools to Scalekit context and capabilities based on simple natural language queries. This MCP server enables AI assistants to interact with Scalekit's identity and access management platform through a standardized set of tools. It provides secure, OAuth-protected access to manage environments, organizations, users, authentication connections, and more.

Features
- Environment management and configuration
- Organization and user management
- Workspace member administration
- OIDC connection setup and management
- MCP server registration and configuration
- Role and scope management
- Admin portal link generation

Configuration
The Scalekit MCP server can be configured to support OAuth for compatible clients. If your MCP client doesn't support OAuth-based authorization for MCP servers, you can still use the Scalekit MCP server with mcp-remote acting as a local proxy to add OAuth support.

Using OAuth:

  {
    "servers": {
      "scalekit": {
        "type": "http",
        "url": "https://mcp.scalekit.com/"
      }
    }
  }

Using mcp-remote:

  {
    "mcpServers": {
      "scalekit": {
        "command": "npx",
        "args": ["-y", "mcp-remote", "https://mcp.scalekit.com/"]
      }
    }
  }

★★★★★

5 months ago

PyGhidra MCP

clearbluejar

PyGhidra-MCP - Ghidra Model Context Protocol Server

pyghidra-mcp is a command-line Model Context Protocol (MCP) server that brings the full analytical power of Ghidra, a robust software reverse engineering (SRE) suite, into the world of intelligent agents and LLM-based tooling. It bridges Ghidra's Program API and FlatProgramAPI to Python using pyghidra and jpype, then exposes that functionality via the Model Context Protocol. MCP is a unified interface that allows language models, development tools (like VS Code), and autonomous agents to access structured context, invoke tooling, and collaborate intelligently. Think of MCP as the bridge between powerful analysis tools and the LLM ecosystem. With pyghidra-mcp, Ghidra becomes an intelligent backend, ready to respond to context-rich queries, automate deep reverse engineering tasks, and integrate into AI-assisted workflows.

Yet another Ghidra MCP? Yes, the original ghidra-mcp is fantastic. But pyghidra-mcp takes a different approach:
🐍 No GUI required – Run entirely via CLI for streamlined automation and scripting.
🔁 Designed for automation – Ideal for integrating with LLMs, CI pipelines, and tooling that needs repeatable behavior.
✅ CI/CD friendly – Built with robust unit and integration tests for both client and server sessions.
🚀 Quick startup – Supports fast command-line launching with minimal setup.
📦 Project-wide analysis – Enables concurrent reverse engineering of all binaries in a Ghidra project.
🤖 Agent-ready – Built for intelligent agent-driven workflows and large-scale reverse engineering automation.
🔍 Semantic code search – Uses vector embeddings (via ChromaDB) to enable fast, fuzzy lookup across decompiled functions, comments, and symbols, perfect for pseudo-C exploration and agent-driven triage.

★★★★★

5 months ago

Docker Mcp Server

docker-mcp-server

Docker MCP Server

A comprehensive Model Context Protocol (MCP) server that provides advanced Docker operations through a unified interface. This server combines 16 powerful Docker MCP tools with 25+ convenient CLI aliases to create a complete Docker workflow solution for developers, DevOps engineers, and system administrators.

🌟 What Makes Docker MCP Server Special
Docker MCP Server is not just another Docker wrapper - it's a complete Docker workflow enhancement system designed to make Docker operations more intuitive, secure, and efficient:

🎯 Unified Interface
- MCP Protocol Integration: seamlessly works with MCP-compatible tools and IDEs
- CLI Convenience: 25+ carefully crafted aliases for common Docker workflows
- Consistent API: all operations follow the same patterns and conventions
- Cross-Platform: full support for Linux, macOS, and Windows environments

🔒 Security-First Design
- Docker-Managed Security: all password operations handled by the Docker daemon for maximum security
- Zero Password Exposure: passwords never appear in command history, process lists, or arguments
- Token Authentication Support: full support for Personal Access Tokens and service accounts
- Registry Flexibility: secure login to Docker Hub, AWS ECR, Azure ACR, Google GCR, and custom registries
- CI/CD Security: secure stdin password input for automated deployment pipelines
- Permission Management: proper handling of Docker daemon permissions and credential storage

🚀 Developer Experience
- Comprehensive Help System: every command includes detailed documentation with --help
- Smart Defaults: sensible default configurations for common use cases
- Error Prevention: built-in safety checks and confirmation prompts for destructive operations
- Rich Output: formatted, colored output with clear status indicators

📊 Advanced Operations
- Complete Container Lifecycle: from build to publish with comprehensive registry support
- Multi-Container Management: Docker Compose integration with service orchestration
- Registry Publishing: advanced image publishing with multi-platform support and automated workflows
- Network & Volume Management: advanced networking and storage operations
- System Maintenance: intelligent cleanup tools with multiple safety levels
- Development Workflows: specialized commands for development environments
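The "secure stdin password input" pattern above relies on Docker's real `docker login --password-stdin` flag: the secret is read from stdin, so it never lands in argv or shell history. A minimal wrapper sketch (registry and username are placeholders):

```python
# Sketch: log in to a registry with the password supplied via stdin only.
import subprocess

def build_login_cmd(registry: str, username: str) -> list[str]:
    """argv for `docker login` with the password expected on stdin."""
    return ["docker", "login", registry, "--username", username, "--password-stdin"]

def docker_login(registry: str, username: str, password: str):
    # Password goes through stdin, never through argv or the shell history.
    return subprocess.run(build_login_cmd(registry, username),
                          input=password.encode(), capture_output=True)
```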

★★★★★

5 months ago

Master MCP Server

Jakedismo

Master MCP Server aggregates multiple MCP servers behind a single, secure endpoint. It provides configuration-driven module loading, unified capability discovery, request routing with resilience, and first-class OAuth flows for multi-backend authentication.

# Highlights
- Aggregates multiple MCP servers with tool/resource discovery and namespacing
- OAuth support: master token pass-through, delegated provider flows, proxy refresh
- Config-driven setup with JSON/YAML, schema validation, and secret resolution
- Resilient routing: load-balancing, retries with backoff/jitter, circuit-breakers
- Cross-platform: Node.js server and Cloudflare Workers runtime
- Production-ready deployment: Docker, Cloudflare Workers, Koyeb
- Testing strategy and CI-ready structure
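The circuit-breaker resilience pattern listed above can be sketched in a few lines: after N consecutive failures the breaker opens and short-circuits calls until a cool-down elapses. Thresholds here are arbitrary examples, not this project's defaults:

```python
# Toy circuit breaker: open after `threshold` consecutive failures, then
# reject calls outright until `cooldown` seconds have passed (half-open retry).
import time

class CircuitBreaker:
    def __init__(self, threshold=3, cooldown=30.0):
        self.threshold, self.cooldown = threshold, cooldown
        self.failures, self.opened_at = 0, None

    def call(self, fn):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.cooldown:
                raise RuntimeError("circuit open: backend skipped")
            self.opened_at, self.failures = None, 0  # half-open: try again
        try:
            result = fn()
        except Exception:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = time.monotonic()    # trip the breaker
            raise
        self.failures = 0                            # success resets the count
        return result
```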

★★★★★

5 months ago

Xcode Mcp Server (drewster99)

drewster99

An MCP (Model Context Protocol) server for controlling and interacting with Xcode from AI assistants and LLMs like Claude Code, Cursor, Claude Desktop, LM Studio, etc. This server significantly improves the build cycle. Now Claude (or your favorite tool) can directly command Xcode to build your project. Because Xcode builds it directly (rather than the xcodebuild command line or similar), the build happens exactly the same way as when you build it in Xcode. Xcode-mcp-server returns relevant build errors or warnings back to your coding tool (like Cursor or Claude Code), so the LLM sees exactly the same errors you do.

Included tool functions (you don't really need to know this info, since your coding LLM will get it, and more detail, automatically, but it's included here for the curious):
- version - Returns xcode-mcp-server's version string
- get_xcode_projects - Finds all .xcodeproj and .xcworkspace projects in the given search_path. If search_path is empty, all paths to which the tool has been granted access are searched
- get_project_hierarchy - Returns the path hierarchy of the project or workspace
- get_project_schemes - Returns a list of build schemes for the specified project
- build_project - Commands Xcode to build. This is the workhorse that builds your project again and again, returning success or build errors
- run_project - Commands Xcode to run your project
- get_build_errors - Returns the most recent build errors from the given project
- clean_project - Cleans the build

★★★★★

5 months ago

Maven Tools Mcp Server

arvindand

# Maven Tools MCP - AI-Powered Maven Central Intelligence

MCP server providing instant, accurate dependency analysis for Maven, Gradle, SBT, Mill, and all JVM build tools. Get dependency intelligence that's faster and more reliable than web searches.

## Key Features
- **Bulk Operations**: Analyze 20+ dependencies in one call (<500ms vs 60+ seconds manually)
- **Universal JVM Support**: Works with Maven, Gradle, SBT, Mill using standard Maven coordinates
- **Version Intelligence**: Automatic classification (stable/RC/beta/alpha) with stability filtering
- **Age Analysis**: Classify dependencies as fresh/current/aging/stale with actionable insights
- **Context7 Integration**: Smart documentation hints for complex upgrades and migrations
- **Enterprise Performance**: <100ms cached responses, native images

## Perfect For
- "Check all dependencies in this build file for latest versions"
- "Show me only stable versions for production deployment"
- "How old are my dependencies and which ones need attention?"
- "Compare my current versions but only suggest stable upgrades"

**Docker Installation**: One-command setup with multi-architecture support.
**GitHub**: https://github.com/arvindand/maven-tools-mcp
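The stable/RC/beta/alpha classification above can be sketched from common Maven version-suffix conventions. The rules below are illustrative of the idea, not this server's exact logic:

```python
# Toy Maven version classifier based on widely used qualifier suffixes
# (alpha, beta, RC/CR, SNAPSHOT, milestone). Anything else counts as stable.
import re

def classify_version(version: str) -> str:
    v = version.lower()
    if re.search(r"(^|[.-])alpha", v):
        return "alpha"
    if re.search(r"(^|[.-])beta", v):
        return "beta"
    if re.search(r"(^|[.-])(rc|cr)\d*", v):
        return "rc"
    if re.search(r"snapshot|milestone|(^|[.-])m\d+$", v):
        return "pre-release"
    return "stable"
```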

★★★★★

6 months ago

Superjolt MCP Server - AI-Powered JavaScript Deployment Platform

scoritz

Superjolt is a Platform-as-a-Service (PaaS) that enables developers to deploy JavaScript applications instantly with a single command: npx superjolt deploy. Through its native Model Context Protocol (MCP) integration, developers can manage their entire deployment infrastructure using natural language via Claude Desktop. Instead of memorizing CLI commands or navigating complex dashboards, you can simply tell Claude what you want: "Show me all my running services," "Restart the API service," or "Set the database URL for production." Superjolt handles the complexities of deployment including automatic HTTPS, environment variables, real-time logs, and service management. The MCP integration transforms infrastructure management from a technical task into a conversation. Claude can access your deployment state, understand your infrastructure context, and execute complex workflows through simple commands. Whether you're deploying a new application, troubleshooting issues, or managing multiple environments, the MCP server provides Claude with the tools to list machines and services, start/stop/restart applications, manage environment variables, view logs, and handle the entire deployment lifecycle. This makes Superjolt ideal for developers who want to focus on building rather than configuring, offering the simplicity of serverless with the power of a full deployment platform—all controlled through natural language.

★★★★★

6 months ago