diff --git a/servers/mcp-filesystem/readme.md b/servers/mcp-filesystem/readme.md new file mode 100644 index 000000000..5ae0997d1 --- /dev/null +++ b/servers/mcp-filesystem/readme.md @@ -0,0 +1,199 @@ +# MCP Filesystem Server + +Advanced filesystem operations for AI agents with strict security boundaries. + +## Overview + +The MCP Filesystem Server provides AI agents with advanced file operations beyond basic read/write, including batch operations, directory watching, file search/indexing, and permission management, all within strict security boundaries. + +## Features + +### Core Capabilities + +- **Batch Operations**: Execute multiple file operations (copy, move, delete) atomically with automatic rollback on failure +- **Directory Watching**: Monitor filesystem changes in real-time with event filtering and recursive watching +- **File Search & Indexing**: Fast full-text search with metadata filtering using Lunr.js +- **Checksum Operations**: Compute and verify file integrity using MD5, SHA-1, SHA-256, or SHA-512 +- **Symlink Management**: Create and manage symbolic links within workspace boundaries +- **Disk Usage Analysis**: Analyze directory sizes, identify large files, and get file type breakdowns +- **Directory Operations**: Recursive copy, sync (only newer/missing files), and atomic file replacement + +### Security Features + +The server implements defense-in-depth security with 10 layers of path validation: + +1. **Absolute Path Resolution**: Prevents relative path tricks +2. **Workspace Boundary Check**: Ensures path is within workspace +3. **Path Traversal Detection**: Blocks `..` and `./` sequences +4. **System Path Blocklist**: Hardcoded system directories (cannot be overridden) +5. **Sensitive Pattern Blocklist**: Hardcoded sensitive files (cannot be overridden) +6. **Subdirectory Restrictions**: Optional allowlist within workspace +7. **User Blocklist**: Custom blocked paths +8. **User Pattern Blocklist**: Custom blocked patterns +9. **Read-Only Mode**: Prevents write/delete operations +10. **Symlink Validation**: Validates symlink targets are within workspace + +### Hardcoded Security (Cannot Be Disabled) + +**System Paths (Always Blocked):** + +- `/etc`, `/sys`, `/proc`, `/dev`, `/boot`, `/root` +- `C:\Windows`, `C:\Program Files` +- `/System`, `/Library`, `/Applications` (macOS) +- `/bin`, `/sbin`, `/usr/bin`, `/usr/sbin` + +**Sensitive Patterns (Always Blocked):** + +- `.ssh/`, `.aws/`, `.kube/` +- `id_rsa`, `*.pem`, `*.key`, `*.p12`, `*.pfx` +- Files containing: `password`, `secret`, `token` +- `.env` files + +## Installation + +### NPM + +```bash +npm install -g @ai-capabilities-suite/mcp-filesystem +``` + +### Docker + +```bash +docker pull digitaldefiance/mcp-filesystem:latest +``` + +## Configuration + +### Required Configuration + +Create a configuration file (e.g., `mcp-filesystem-config.json`): + +```json +{ + "workspaceRoot": "/path/to/your/workspace", + "blockedPaths": [".git", ".env", "node_modules"], + "blockedPatterns": ["*.key", "*.pem", "*.env"], + "maxFileSize": 104857600, + "maxBatchSize": 1073741824, + "maxOperationsPerMinute": 100, + "enableAuditLog": true, + "readOnly": false +} +``` + +### MCP Client Configuration + +Add to your MCP client configuration: + +```json +{ + "mcpServers": { + "filesystem": { + "command": "mcp-filesystem", + "args": ["--config", "/path/to/mcp-filesystem-config.json"] + } + } +} +``` + +## Available Tools + +The server exposes 12 MCP tools: + +1. **fs_batch_operations** - Execute multiple operations atomically +2. 
**fs_watch_directory** - Monitor directory for changes +3. **fs_get_watch_events** - Retrieve accumulated events +4. **fs_stop_watch** - Stop watch session +5. **fs_search_files** - Search by name, content, or metadata +6. **fs_build_index** - Build searchable file index +7. **fs_create_symlink** - Create symbolic link +8. **fs_compute_checksum** - Compute file checksum +9. **fs_verify_checksum** - Verify file integrity +10. **fs_analyze_disk_usage** - Analyze disk usage +11. **fs_copy_directory** - Recursively copy directory +12. **fs_sync_directory** - Sync directories (only newer/missing) + +## Usage Examples + +### Batch File Operations + +```json +{ + "operations": [ + { "type": "copy", "source": "file1.txt", "destination": "backup/file1.txt" }, + { "type": "move", "source": "temp.txt", "destination": "archive/temp.txt" }, + { "type": "delete", "source": "old.txt" } + ], + "atomic": true +} +``` + +### Watch Directory + +```json +{ + "path": "src", + "recursive": true, + "filters": ["*.ts", "*.js"] +} +``` + +### Search Files + +```json +{ + "query": "TODO", + "searchType": "content", + "fileTypes": [".ts", ".js"], + "useIndex": true +} +``` + +### Verify File Integrity + +```json +{ + "path": "important-file.zip", + "checksum": "abc123...", + "algorithm": "sha256" +} +``` + +## Security Warnings + +⚠️ **CRITICAL SECURITY CONSIDERATIONS** + +1. **Workspace Jail**: All operations are confined to the configured workspace root. This cannot be changed after the server starts. + +2. **System Paths**: The server blocks access to system directories. This is hardcoded and cannot be disabled. + +3. **Sensitive Files**: The server blocks access to SSH keys, AWS credentials, and other sensitive files. This is hardcoded and cannot be disabled. + +4. **Rate Limiting**: Configure appropriate rate limits to prevent abuse. + +5. **Audit Logging**: Enable audit logging for security monitoring and forensics. + +6. **Read-Only Mode**: Consider using read-only mode for untrusted agents. 
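+## Programmatic Usage
+
+Because the tool inputs shown above are plain JSON arguments, any MCP client can drive this server. The sketch below is a minimal, illustrative example rather than part of the server itself: it assumes the official `@modelcontextprotocol/sdk` TypeScript client and a globally installed `mcp-filesystem` binary, so adjust the command, config path, and file names to your environment.
+
+```typescript
+import { Client } from "@modelcontextprotocol/sdk/client/index.js";
+import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
+
+async function main() {
+  // Launch the server over stdio, pointing it at an example config path.
+  const transport = new StdioClientTransport({
+    command: "mcp-filesystem",
+    args: ["--config", "/path/to/mcp-filesystem-config.json"],
+  });
+
+  const client = new Client(
+    { name: "example-client", version: "1.0.0" },
+    { capabilities: {} },
+  );
+  await client.connect(transport);
+
+  // Call fs_compute_checksum for a file inside the configured workspace.
+  const result = await client.callTool({
+    name: "fs_compute_checksum",
+    arguments: { path: "important-file.zip", algorithm: "sha256" },
+  });
+  console.log(result.content);
+
+  await client.close();
+}
+
+main().catch(console.error);
+```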
+ +## Documentation + +- **Full Documentation**: [GitHub Repository](https://github.com/Digital-Defiance/ai-capabilities-suite/tree/main/packages/mcp-filesystem) +- **Security Guide**: [SECURITY.md](https://github.com/Digital-Defiance/ai-capabilities-suite/blob/main/packages/mcp-filesystem/SECURITY.md) +- **Docker Guide**: [DOCKER.md](https://github.com/Digital-Defiance/ai-capabilities-suite/blob/main/packages/mcp-filesystem/DOCKER.md) +- **API Reference**: [README.md](https://github.com/Digital-Defiance/ai-capabilities-suite/blob/main/packages/mcp-filesystem/README.md) + +## Support + +- **Issues**: [GitHub Issues](https://github.com/Digital-Defiance/ai-capabilities-suite/issues) +- **Email**: info@digitaldefiance.org + +## License + +MIT License - see [LICENSE](https://github.com/Digital-Defiance/ai-capabilities-suite/blob/main/packages/mcp-filesystem/LICENSE) + +## Related Projects + +- [MCP Debugger](https://github.com/Digital-Defiance/ai-capabilities-suite/tree/main/packages/mcp-debugger-server) - Debug Node.js applications via MCP +- [MCP Process](https://github.com/Digital-Defiance/ai-capabilities-suite/tree/main/packages/mcp-process) - Process management via MCP +- [MCP Screenshot](https://github.com/Digital-Defiance/ai-capabilities-suite/tree/main/packages/mcp-screenshot) - Screenshot capture via MCP diff --git a/servers/mcp-filesystem/server.yaml b/servers/mcp-filesystem/server.yaml new file mode 100644 index 000000000..f7c36fb90 --- /dev/null +++ b/servers/mcp-filesystem/server.yaml @@ -0,0 +1,136 @@ +name: mcp-filesystem +image: mcp/mcp-filesystem +type: server +meta: + category: devops + tags: + - filesystem + - file-operations + - batch-operations + - directory-watching + - file-search + - security + - devops + - ai-capabilities +about: + title: ACS Filesystem + description: | + Advanced filesystem operations for AI agents with strict security boundaries. + + Features: + - Batch Operations: Execute multiple file operations atomically with rollback + - Directory Watching: Monitor filesystem changes in real-time with event filtering + - File Search & Indexing: Fast full-text search with metadata filtering + - Checksum Operations: Compute and verify file integrity (MD5, SHA-1, SHA-256, SHA-512) + - Symlink Management: Create and manage symbolic links within workspace + - Disk Usage Analysis: Analyze directory sizes and identify large files + - Directory Operations: Recursive copy, sync, and atomic file replacement + + Security Features: + - 10-layer path validation + - Workspace boundary enforcement + - System path blocking (hardcoded, cannot be disabled) + - Sensitive file pattern blocking (hardcoded, cannot be disabled) + - Rate limiting + - Audit logging + + ⚠️ SECURITY WARNING: This server provides powerful filesystem operations. + Always configure strict security boundaries and review the SECURITY.md documentation. + icon: https://www.google.com/s2/favicons?domain=digitaldefiance.org&sz=64 +source: + project: https://github.com/Digital-Defiance/mcp-filesystem + branch: main + commit: 29e0fac10ce64449a70c8189b619afb7b443d0e0 +run: + command: + - "{{filesystem.workspaceRoot|volume-target|into}}" + volumes: + - "{{filesystem.workspaceRoot|volume|into}}" + - "{{filesystem.configPath|volume|into}}" + disableNetwork: true +config: + description: | + Configure the MCP ACS Filesystem server with security boundaries and operational limits. 
+ + REQUIRED: workspaceRoot - All operations are confined to this directory + + Security Configuration: + - allowedSubdirectories: Further restrict operations to specific subdirectories + - blockedPaths: Additional paths to block (relative to workspace) + - blockedPatterns: Regex patterns to block + + Resource Limits: + - maxFileSize: Maximum file size in bytes (default: 100MB) + - maxBatchSize: Maximum total size for batch operations (default: 1GB) + - maxOperationsPerMinute: Rate limit per agent (default: 100) + + Operational Settings: + - enableAuditLog: Enable operation logging (default: true) + - requireConfirmation: Require confirmation for destructive operations (default: true) + - readOnly: Enable read-only mode (default: false) + + See SECURITY.md for complete security documentation. + parameters: + type: object + properties: + workspaceRoot: + type: string + description: Absolute path to workspace directory (REQUIRED) - all operations confined to this directory + default: /app/workspace + configPath: + type: string + description: Path to configuration file + default: /app/config/mcp-filesystem-config.json + allowedSubdirectories: + type: array + items: + type: string + description: Optional subdirectories within workspace to restrict operations to + default: [] + blockedPaths: + type: array + items: + type: string + description: Additional paths to block (relative to workspace) + default: + - .git + - .env + - node_modules + - .ssh + blockedPatterns: + type: array + items: + type: string + description: Regex patterns to block + default: + - "*.key" + - "*.pem" + - "*.env" + - "*secret*" + - "*password*" + maxFileSize: + type: integer + description: Maximum file size in bytes + default: 104857600 + maxBatchSize: + type: integer + description: Maximum total size for batch operations in bytes + default: 1073741824 + maxOperationsPerMinute: + type: integer + description: Rate limit per agent + default: 100 + enableAuditLog: + type: boolean + description: Enable operation logging + default: true + requireConfirmation: + type: boolean + description: Require confirmation for destructive operations + default: true + readOnly: + type: boolean + description: Enable read-only mode + default: false + required: + - workspaceRoot diff --git a/servers/mcp-filesystem/tools.json b/servers/mcp-filesystem/tools.json new file mode 100644 index 000000000..fb5439056 --- /dev/null +++ b/servers/mcp-filesystem/tools.json @@ -0,0 +1,439 @@ +[ + { + "name": "fs_batch_operations", + "displayName": "Batch File Operations", + "description": "Execute multiple filesystem operations (copy, move, delete) atomically with automatic rollback on failure", + "category": "batch", + "inputSchema": { + "type": "object", + "properties": { + "operations": { + "type": "array", + "description": "Array of operations to execute", + "items": { + "type": "object", + "properties": { + "type": { + "type": "string", + "enum": ["copy", "move", "delete"], + "description": "Operation type" + }, + "source": { + "type": "string", + "description": "Source file path (relative to workspace)" + }, + "destination": { + "type": "string", + "description": "Destination file path (required for copy/move)" + } + }, + "required": ["type", "source"] + } + }, + "atomic": { + "type": "boolean", + "description": "If true, rollback all operations on any failure", + "default": true + } + }, + "required": ["operations"] + }, + "examples": [ + { + "name": "Atomic batch copy", + "input": { + "operations": [ + { + "type": "copy", + "source": "file1.txt", + 
"destination": "backup/file1.txt" + }, + { + "type": "copy", + "source": "file2.txt", + "destination": "backup/file2.txt" + } + ], + "atomic": true + } + } + ] + }, + { + "name": "fs_watch_directory", + "displayName": "Watch Directory", + "description": "Monitor a directory for filesystem changes (create, modify, delete, rename) with optional filtering", + "category": "watching", + "inputSchema": { + "type": "object", + "properties": { + "path": { + "type": "string", + "description": "Directory path to watch (relative to workspace)" + }, + "recursive": { + "type": "boolean", + "description": "Watch subdirectories recursively", + "default": false + }, + "filters": { + "type": "array", + "items": { "type": "string" }, + "description": "Glob patterns to filter events (e.g., ['*.ts', '*.js'])" + } + }, + "required": ["path"] + }, + "examples": [ + { + "name": "Watch source directory", + "input": { + "path": "src", + "recursive": true, + "filters": ["*.ts", "*.tsx"] + } + } + ] + }, + { + "name": "fs_get_watch_events", + "displayName": "Get Watch Events", + "description": "Retrieve accumulated filesystem events from an active watch session", + "category": "watching", + "inputSchema": { + "type": "object", + "properties": { + "sessionId": { + "type": "string", + "description": "Watch session ID from fs_watch_directory" + } + }, + "required": ["sessionId"] + }, + "examples": [ + { + "name": "Get events", + "input": { + "sessionId": "abc-123" + } + } + ] + }, + { + "name": "fs_stop_watch", + "displayName": "Stop Watch", + "description": "Stop an active directory watch session and clean up resources", + "category": "watching", + "inputSchema": { + "type": "object", + "properties": { + "sessionId": { + "type": "string", + "description": "Watch session ID to stop" + } + }, + "required": ["sessionId"] + }, + "examples": [ + { + "name": "Stop watch", + "input": { + "sessionId": "abc-123" + } + } + ] + }, + { + "name": "fs_search_files", + "displayName": "Search Files", + "description": "Search for files by name, content, or metadata with optional indexing for fast retrieval", + "category": "search", + "inputSchema": { + "type": "object", + "properties": { + "query": { + "type": "string", + "description": "Search query string" + }, + "searchType": { + "type": "string", + "enum": ["name", "content", "both"], + "description": "Search by filename, content, or both", + "default": "name" + }, + "fileTypes": { + "type": "array", + "items": { "type": "string" }, + "description": "File extensions to filter (e.g., ['.ts', '.js'])" + }, + "minSize": { + "type": "number", + "description": "Minimum file size in bytes" + }, + "maxSize": { + "type": "number", + "description": "Maximum file size in bytes" + }, + "modifiedAfter": { + "type": "string", + "description": "ISO date string - only files modified after this date" + }, + "useIndex": { + "type": "boolean", + "description": "Use file index for faster search", + "default": true + } + }, + "required": ["query"] + }, + "examples": [ + { + "name": "Search for TODO comments", + "input": { + "query": "TODO", + "searchType": "content", + "fileTypes": [".ts", ".js"], + "useIndex": true + } + } + ] + }, + { + "name": "fs_build_index", + "displayName": "Build File Index", + "description": "Build a searchable index of files for fast searching across large codebases", + "category": "search", + "inputSchema": { + "type": "object", + "properties": { + "path": { + "type": "string", + "description": "Directory path to index (relative to workspace)" + }, + "includeContent": { + 
"type": "boolean", + "description": "Index file contents (text files only)", + "default": false + } + }, + "required": ["path"] + }, + "examples": [ + { + "name": "Index source directory", + "input": { + "path": "src", + "includeContent": true + } + } + ] + }, + { + "name": "fs_create_symlink", + "displayName": "Create Symlink", + "description": "Create a symbolic link within the workspace (target must be within workspace)", + "category": "links", + "inputSchema": { + "type": "object", + "properties": { + "linkPath": { + "type": "string", + "description": "Path where symlink will be created" + }, + "targetPath": { + "type": "string", + "description": "Path the symlink points to (must be within workspace)" + } + }, + "required": ["linkPath", "targetPath"] + }, + "examples": [ + { + "name": "Create symlink", + "input": { + "linkPath": "current", + "targetPath": "releases/v1.0.0" + } + } + ] + }, + { + "name": "fs_compute_checksum", + "displayName": "Compute Checksum", + "description": "Compute file checksum for integrity verification using MD5, SHA-1, SHA-256, or SHA-512", + "category": "integrity", + "inputSchema": { + "type": "object", + "properties": { + "path": { + "type": "string", + "description": "File path (relative to workspace)" + }, + "algorithm": { + "type": "string", + "enum": ["md5", "sha1", "sha256", "sha512"], + "description": "Hash algorithm", + "default": "sha256" + } + }, + "required": ["path"] + }, + "examples": [ + { + "name": "Compute SHA-256", + "input": { + "path": "important-file.zip", + "algorithm": "sha256" + } + } + ] + }, + { + "name": "fs_verify_checksum", + "displayName": "Verify Checksum", + "description": "Verify a file's checksum matches the expected value", + "category": "integrity", + "inputSchema": { + "type": "object", + "properties": { + "path": { + "type": "string", + "description": "File path (relative to workspace)" + }, + "checksum": { + "type": "string", + "description": "Expected checksum hex string" + }, + "algorithm": { + "type": "string", + "enum": ["md5", "sha1", "sha256", "sha512"], + "description": "Hash algorithm used", + "default": "sha256" + } + }, + "required": ["path", "checksum"] + }, + "examples": [ + { + "name": "Verify file integrity", + "input": { + "path": "important-file.zip", + "checksum": "abc123...", + "algorithm": "sha256" + } + } + ] + }, + { + "name": "fs_analyze_disk_usage", + "displayName": "Analyze Disk Usage", + "description": "Analyze disk usage, identify large files/directories, and get file type breakdowns", + "category": "analysis", + "inputSchema": { + "type": "object", + "properties": { + "path": { + "type": "string", + "description": "Directory path to analyze (relative to workspace)" + }, + "depth": { + "type": "number", + "description": "Maximum depth to traverse", + "default": -1 + }, + "groupByType": { + "type": "boolean", + "description": "Group results by file extension", + "default": false + } + }, + "required": ["path"] + }, + "examples": [ + { + "name": "Analyze project directory", + "input": { + "path": ".", + "depth": 3, + "groupByType": true + } + } + ] + }, + { + "name": "fs_copy_directory", + "displayName": "Copy Directory", + "description": "Recursively copy a directory with options to preserve metadata and exclude patterns", + "category": "directory", + "inputSchema": { + "type": "object", + "properties": { + "source": { + "type": "string", + "description": "Source directory path" + }, + "destination": { + "type": "string", + "description": "Destination directory path" + }, + "preserveMetadata": { 
+ "type": "boolean", + "description": "Preserve timestamps and permissions", + "default": false + }, + "exclusions": { + "type": "array", + "items": { "type": "string" }, + "description": "Glob patterns to exclude (e.g., ['*.test.ts', 'node_modules/**'])" + } + }, + "required": ["source", "destination"] + }, + "examples": [ + { + "name": "Copy with exclusions", + "input": { + "source": "src", + "destination": "backup", + "preserveMetadata": true, + "exclusions": ["*.test.ts", "node_modules/**"] + } + } + ] + }, + { + "name": "fs_sync_directory", + "displayName": "Sync Directory", + "description": "Sync directories by copying only files that are newer or missing in the destination", + "category": "directory", + "inputSchema": { + "type": "object", + "properties": { + "source": { + "type": "string", + "description": "Source directory path" + }, + "destination": { + "type": "string", + "description": "Destination directory path" + }, + "exclusions": { + "type": "array", + "items": { "type": "string" }, + "description": "Glob patterns to exclude" + } + }, + "required": ["source", "destination"] + }, + "examples": [ + { + "name": "Sync to backup", + "input": { + "source": "src", + "destination": "backup", + "exclusions": ["*.test.ts"] + } + } + ] + } +]