Compare commits


2 Commits

| SHA1 | Message | Date |
|------|---------|------|
| cd840cb8ca | restructure of dirs, huge docs update | 2025-11-17 16:29:14 -06:00 |
| 456e052389 | updating docs | 2025-11-17 15:50:15 -06:00 |
95 changed files with 4257 additions and 10908 deletions

.gitignore vendored
@@ -37,3 +37,6 @@ Thumbs.db
 # Docker
 .dockerignore
+#mounted dirs
+configs/

Dockerfile

@@ -23,8 +23,8 @@ RUN git clone https://github.com/robertdavidgraham/masscan /tmp/masscan && \
 WORKDIR /app
 # Copy requirements and install Python dependencies
-COPY requirements.txt .
-COPY requirements-web.txt .
+COPY app/requirements.txt .
+COPY app/requirements-web.txt .
 RUN pip install --no-cache-dir -r requirements.txt && \
     pip install --no-cache-dir -r requirements-web.txt
@@ -33,12 +33,12 @@ RUN pip install --no-cache-dir -r requirements.txt && \
 RUN playwright install chromium
 # Copy application code
-COPY src/ ./src/
-COPY templates/ ./templates/
-COPY web/ ./web/
-COPY migrations/ ./migrations/
-COPY alembic.ini .
-COPY init_db.py .
+COPY app/src/ ./src/
+COPY app/templates/ ./templates/
+COPY app/web/ ./web/
+COPY app/migrations/ ./migrations/
+COPY app/alembic.ini .
+COPY app/init_db.py .
 # Create required directories
 RUN mkdir -p /app/output /app/logs

README.md
@@ -1,28 +1,26 @@
# SneakyScanner
-A comprehensive network scanning and infrastructure monitoring platform with both CLI and web interfaces. SneakyScanner uses masscan for fast port discovery, nmap for service detection, sslyze for SSL/TLS analysis, and Playwright for webpage screenshots to perform comprehensive infrastructure audits.
+A comprehensive network scanning and infrastructure monitoring platform with web interface and CLI scanner. SneakyScanner uses masscan for fast port discovery, nmap for service detection, sslyze for SSL/TLS analysis, and Playwright for webpage screenshots to perform comprehensive infrastructure audits.
**Features:**
- 🔍 **CLI Scanner** - Standalone scanning tool with YAML-based configuration
- 🌐 **Web Application** - Flask-based web UI with REST API for scan management
- 📊 **Database Storage** - SQLite database for scan history and trend analysis
- ⏱️ **Background Jobs** - Asynchronous scan execution with APScheduler
- 🔐 **Authentication** - Secure session-based authentication system
- 📈 **Historical Data** - Track infrastructure changes over time
**Primary Interface**: Web Application (Flask-based GUI)
**Alternative**: Standalone CLI Scanner (for testing and CI/CD)
## Table of Contents
---
1. [Quick Start](#quick-start)
- [Web Application (Recommended)](#web-application-recommended)
- [CLI Scanner (Standalone)](#cli-scanner-standalone)
2. [Features](#features)
3. [Web Application](#web-application)
4. [CLI Scanner](#cli-scanner)
5. [Configuration](#configuration)
6. [Output Formats](#output-formats)
7. [API Documentation](#api-documentation)
8. [Deployment](#deployment)
9. [Development](#development)
## Key Features
- 🌐 **Web Dashboard** - Modern web UI for scan management, scheduling, and historical analysis
- 📊 **Database Storage** - SQLite-based scan history with trend analysis and comparison
- ⏱️ **Scheduled Scans** - Cron-based automated scanning with APScheduler
- 🔧 **Config Creator** - CIDR-to-YAML configuration builder for quick setup
- 🔍 **Network Discovery** - Fast port scanning with masscan (all 65535 ports, TCP/UDP)
- 🎯 **Service Detection** - Nmap-based service enumeration with version detection
- 🔒 **SSL/TLS Analysis** - Certificate extraction, TLS version testing, cipher suite analysis
- 📸 **Screenshot Capture** - Automated webpage screenshots for all discovered web services
- 📈 **Drift Detection** - Expected vs. actual infrastructure comparison
- 📋 **Multi-Format Reports** - JSON, HTML, and ZIP archives with visual reports
- 🔐 **Authentication** - Session-based login for single-user deployments
- 🔔 **Alerts** *(Phase 5 - Coming Soon)* - Email and webhook notifications for misconfigurations
---
@@ -30,764 +28,148 @@ A comprehensive network scanning and infrastructure monitoring platform with bot
### Web Application (Recommended)
The web application provides a complete interface for managing scans, viewing history, and analyzing results.
1. **Configure environment:**
 ```bash
-# Copy example environment file
+# 1. Clone repository
+git clone <repository-url>
+cd SneakyScan
+# 2. Configure environment
 cp .env.example .env
-# Edit .env and set SECRET_KEY and SNEAKYSCANNER_ENCRYPTION_KEY
-# Generate secure keys (Linux/Mac)
-export SECRET_KEY=$(python3 -c 'import secrets; print(secrets.token_hex(32))')
-export ENCRYPTION_KEY=$(python3 -c 'import secrets; print(secrets.token_urlsafe(32))')
+# 3. Build and start
+docker compose build
+docker compose up -d
-# Update .env file with generated keys
-sed -i "s/your-secret-key-here/$SECRET_KEY/" .env
-sed -i "s/your-encryption-key-here/$ENCRYPTION_KEY/" .env
+# 4. Initialize database
+docker compose run --rm init-db --password "YourSecurePassword"
+# 5. Access web interface
+# Open http://localhost:5000
 ```
2. **Start the web application:**
```bash
docker-compose -f docker-compose-web.yml up -d
```
3. **Access the web interface:**
- Open http://localhost:5000 in your browser
- Default password: `admin` (change immediately after first login)
4. **Trigger your first scan:**
- Click "Run Scan Now" on the dashboard
- Or use the API:
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/example-site.yaml"}' \
-b cookies.txt
```
-See [Deployment Guide](docs/ai/DEPLOYMENT.md) for detailed setup instructions.
+**See [Deployment Guide](docs/DEPLOYMENT.md) for detailed setup instructions.**
### CLI Scanner (Standalone)
-For quick one-off scans or scripting, use the standalone CLI scanner:
+For quick one-off scans without the web interface:
 ```bash
-# Build the image
-docker-compose build
+# Build and run
+docker compose -f docker-compose-standalone.yml build
+docker compose -f docker-compose-standalone.yml up
-# Run a scan
-docker-compose up
+# Or run directly
+docker run --rm --privileged --network host \
+  -v $(pwd)/configs:/app/configs:ro \
+  -v $(pwd)/output:/app/output \
+  sneakyscanner /app/configs/example-site.yaml
+# Results saved to ./output/
 ```
-Results are saved to the `output/` directory as JSON, HTML, and ZIP files.
+**See [CLI Scanning Guide](docs/CLI_SCANNING.md) for detailed usage.**
---
## Features
## Documentation
### Web Application (Phase 2)
### User Guides
- **[Deployment Guide](docs/DEPLOYMENT.md)** - Installation, configuration, and production deployment
- **[CLI Scanning Guide](docs/CLI_SCANNING.md)** - Standalone scanner usage, configuration, and output formats
- **[API Reference](docs/API_REFERENCE.md)** - Complete REST API documentation
- **Dashboard** - View scan history, statistics, and recent activity
- **REST API** - Programmatic access to all scan management functions
- **Background Jobs** - Scans execute asynchronously without blocking
- **Database Storage** - Complete scan history with queryable data
- **Authentication** - Secure session-based login system
- **Pagination** - Efficiently browse large scan datasets
- **Status Tracking** - Real-time scan progress monitoring
- **Error Handling** - Comprehensive error logging and reporting
### Network Discovery & Port Scanning
- **YAML-based configuration** for defining scan targets and expectations
- **Comprehensive scanning using masscan**:
- Ping/ICMP echo detection (masscan --ping)
- TCP port scanning (all 65535 ports at 10,000 pps)
- UDP port scanning (all 65535 ports at 10,000 pps)
- Fast network-wide discovery in seconds
### Service Detection & Enumeration
- **Service detection using nmap**:
- Identifies services running on discovered TCP ports
- Extracts product names and versions (e.g., "OpenSSH 8.2p1", "nginx 1.18.0")
- Provides detailed service information including extra attributes
- Balanced intensity level (5) for accuracy and speed
### Security Assessment
- **HTTP/HTTPS analysis and SSL/TLS security assessment**:
- Detects HTTP vs HTTPS on web services
- Extracts SSL certificate details (subject, issuer, expiration, SANs)
- Calculates days until certificate expiration for monitoring
- Tests TLS version support (TLS 1.0, 1.1, 1.2, 1.3)
- Lists all accepted cipher suites for each supported TLS version
- Identifies weak cryptographic configurations
### Visual Documentation
- **Webpage screenshot capture** (NEW):
- Automatically captures screenshots of all discovered web services (HTTP/HTTPS)
- Uses Playwright with headless Chromium browser
- Viewport screenshots (1280x720) for consistent sizing
- 15-second timeout per page with graceful error handling
- Handles self-signed certificates without errors
- Saves screenshots as PNG files with references in JSON reports
- Screenshots organized in timestamped directories
- Browser reuse for optimal performance
### Reporting & Output
- **Automatic multi-format output** after each scan:
- Machine-readable JSON reports for post-processing
- Human-readable HTML reports with dark theme
- ZIP archives containing all outputs for easy sharing
- **HTML report features**:
- Comprehensive reports with dark theme for easy reading
- Summary dashboard with scan statistics, drift alerts, and security warnings
- Site-by-site breakdown with expandable service details
- Visual badges for expected vs. unexpected services
- SSL/TLS certificate details with expiration warnings
- Automatically generated after every scan
- **Dockerized** for consistent execution environment and root privilege isolation
- **Expected vs. Actual comparison** to identify infrastructure drift
- Timestamped reports with complete scan duration metrics
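The expected-vs.-actual comparison described above reduces to simple set arithmetic. A minimal sketch (the `port_drift` helper is hypothetical, not the project's actual code; expected ports taken from the example config in this README):

```python
def port_drift(expected_ports, actual_ports):
    """Compare expected vs. discovered ports and report drift."""
    expected = set(expected_ports)
    actual = set(actual_ports)
    return {
        "unexpected": sorted(actual - expected),  # open but not in config
        "missing": sorted(expected - actual),     # in config but not open
        "matched": sorted(expected & actual),
    }

# Port 3000 is open but not expected, so it shows up as drift.
drift = port_drift([22, 80, 443], [22, 80, 443, 3000])
```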
### Developer Resources
- **[Roadmap](docs/ROADMAP.md)** - Project roadmap, architecture, and planned features
---
## Web Application
## Current Status
### Overview
**Latest Version**: Phase 4 Complete ✅
**Last Updated**: 2025-11-17
The SneakyScanner web application provides a Flask-based interface for managing network scans. All scans are stored in a SQLite database, enabling historical analysis and trending.
### Completed Phases
### Key Features
- ✅ **Phase 1**: Database schema, SQLAlchemy models, settings system
- ✅ **Phase 2**: REST API, background jobs, authentication, web UI
- ✅ **Phase 3**: Dashboard, scheduling, trend charts
- ✅ **Phase 4**: Config creator, YAML editor, config management UI
**Scan Management:**
- Trigger scans via web UI or REST API
- View complete scan history with pagination
- Monitor real-time scan status
- Delete scans and associated files
### Next Up: Phase 5 - Email, Webhooks & Comparisons
**REST API:**
- Full CRUD operations for scans
- Session-based authentication
- JSON responses for all endpoints
- Comprehensive error handling
**Core Use Case**: Monitor infrastructure for misconfigurations that expose unexpected ports/services. When a scan detects an open port not in the config's `expected_ports` list, trigger immediate notifications.
**Background Processing:**
- APScheduler for async scan execution
- Up to 3 concurrent scans (configurable)
- Status tracking: `running` → `completed`/`failed`
- Error capture and logging
**Planned Features**:
- Email notifications for infrastructure changes
- Webhook integrations (Slack, PagerDuty, custom SIEM)
- Alert rule engine (unexpected ports, cert expiry, weak TLS)
- Scan comparison reports for drift detection
**Database Schema:**
- 11 normalized tables for scan data
- Relationships: Scans → Sites → IPs → Ports → Services → Certificates → TLS Versions
- Efficient queries with indexes
- SQLite WAL mode for better concurrency
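A toy sketch of what two of those related tables might look like in SQLAlchemy (table and column names here are illustrative only; the real models live in `web/models.py` and differ in detail):

```python
from sqlalchemy import Column, DateTime, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class Scan(Base):
    __tablename__ = "scans"
    id = Column(Integer, primary_key=True)
    status = Column(String, default="running")  # running -> completed/failed
    started_at = Column(DateTime)
    sites = relationship("Site", back_populates="scan", cascade="all, delete-orphan")

class Site(Base):
    __tablename__ = "sites"
    id = Column(Integer, primary_key=True)
    scan_id = Column(Integer, ForeignKey("scans.id"), index=True)  # indexed FK
    name = Column(String)
    scan = relationship("Scan", back_populates="sites")

# In-memory SQLite for the sketch; the real app uses a file-backed DB
# and enables WAL mode (PRAGMA journal_mode=WAL) for concurrency.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Scan(status="completed", sites=[Site(name="Web Servers")]))
    session.commit()
    site_count = session.query(Site).count()
```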
### Web UI Routes
| Route | Description |
|-------|-------------|
| `/` | Redirects to dashboard |
| `/login` | Login page |
| `/logout` | Logout and destroy session |
| `/dashboard` | Main dashboard with stats and recent scans |
| `/scans` | Browse scan history (paginated) |
| `/scans/<id>` | View detailed scan results |
### API Endpoints
See [API_REFERENCE.md](docs/ai/API_REFERENCE.md) for complete API documentation.
**Core Endpoints:**
- `POST /api/scans` - Trigger new scan
- `GET /api/scans` - List scans (paginated, filterable)
- `GET /api/scans/{id}` - Get scan details
- `GET /api/scans/{id}/status` - Poll scan status
- `DELETE /api/scans/{id}` - Delete scan and files
**Settings Endpoints:**
- `GET /api/settings` - Get all settings
- `PUT /api/settings/{key}` - Update setting
- `GET /api/settings/health` - Health check
### Authentication
**Login:**
```bash
curl -X POST http://localhost:5000/auth/login \
-H "Content-Type: application/json" \
-d '{"password":"yourpassword"}' \
-c cookies.txt
```
**Use session for API calls:**
```bash
curl -X GET http://localhost:5000/api/scans \
-b cookies.txt
```
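The same login-then-reuse-the-session flow in Python (a hypothetical sketch using the `requests` library; assumes the app from this README is running on localhost:5000):

```python
import requests

def list_scans(password, base="http://localhost:5000"):
    # A Session object keeps the cookie jar, replacing curl's -c/-b cookies.txt.
    s = requests.Session()
    r = s.post(f"{base}/auth/login", json={"password": password})
    r.raise_for_status()
    # The session cookie is sent automatically on subsequent calls.
    return s.get(f"{base}/api/scans").json()
```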
**Change password:**
1. Login to web UI
2. Navigate to Settings
3. Update app password
4. Or use CLI: `python3 web/utils/change_password.py`
See [Roadmap](docs/ROADMAP.md) for complete feature timeline.
---
## CLI Scanner
## Architecture
### Requirements
- Docker
- Docker Compose (optional, for easier usage)
### Using Docker Compose
1. Create or modify a configuration file in `configs/`:
```yaml
title: "My Infrastructure Scan"
sites:
- name: "Web Servers"
ips:
- address: "192.168.1.10"
expected:
ping: true
tcp_ports: [22, 80, 443]
udp_ports: []
```
```
┌─────────────────────────────────────────────────────────────┐
│ Flask Web Application │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────────┐ │
│ │ Web UI │ │ REST API │ │ Scheduler │ │
│ │ (Dashboard) │ │ (JSON/CRUD) │ │ (APScheduler) │ │
│ └──────┬───────┘ └──────┬───────┘ └────────┬─────────┘ │
│ │ │ │ │
│ └─────────────────┴────────────────────┘ │
│ │ │
│ ┌────────▼────────┐ │
│ │ SQLAlchemy │ │
│ │ (ORM Layer) │ │
│ └────────┬────────┘ │
│ │
┌────────▼────────┐ │
│ SQLite3 DB │ │
│ (scan history) │ │
└─────────────────┘ │
└───────────────────────────┬─────────────────────────────────┘
┌──────────▼──────────┐
│ Scanner Engine │
│ (scanner.py) │
│ ┌────────────────┐ │
│ │ Masscan/Nmap │ │
│ │ Playwright │ │
│ │ sslyze │ │
│ └────────────────┘ │
└─────────────────────┘
```
2. Build and run:
```bash
docker-compose build
docker-compose up
```
3. Check results in the `output/` directory:
- `scan_report_YYYYMMDD_HHMMSS.json` - JSON report
- `scan_report_YYYYMMDD_HHMMSS.html` - HTML report
- `scan_report_YYYYMMDD_HHMMSS.zip` - ZIP archive
- `scan_report_YYYYMMDD_HHMMSS_screenshots/` - Screenshots directory
## Scan Performance
SneakyScanner uses a five-phase approach for comprehensive scanning:
1. **Ping Scan** (masscan): ICMP echo detection - ~1-2 seconds
2. **TCP Port Discovery** (masscan): Scans all 65535 TCP ports at 10,000 packets/second - ~13 seconds per 2 IPs
3. **UDP Port Discovery** (masscan): Scans all 65535 UDP ports at 10,000 packets/second - ~13 seconds per 2 IPs
4. **Service Detection** (nmap): Identifies services on discovered TCP ports - ~20-60 seconds per IP with open ports
5. **HTTP/HTTPS Analysis** (Playwright, SSL/TLS): Detects web protocols, captures screenshots, and analyzes certificates - ~10-20 seconds per web service
**Example**: Scanning 2 IPs with 10 open ports each (including 2-3 web services) typically takes 2-3 minutes total.
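Those per-phase figures can be turned into a rough duration estimate (a back-of-the-envelope helper, not part of the scanner; midpoints assumed for the ranged figures):

```python
def estimate_scan_seconds(num_ips, ips_with_open_ports, web_services):
    ping = 2                             # ICMP sweep: ~1-2 s total
    tcp = 13 * (num_ips / 2)             # ~13 s per 2 IPs
    udp = 13 * (num_ips / 2)             # same rate for UDP
    services = 40 * ips_with_open_ports  # ~20-60 s per IP with open ports
    web = 15 * web_services              # ~10-20 s per web service
    return ping + tcp + udp + services + web

# The README example: 2 IPs, both with open ports, ~3 web services.
estimate = estimate_scan_seconds(2, 2, 3)  # ~153 s, i.e. 2-3 minutes
```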
### Using Docker Directly
1. Build the image:
```bash
docker build -t sneakyscanner .
```
2. Run a scan:
```bash
docker run --rm --privileged --network host \
-v $(pwd)/configs:/app/configs:ro \
-v $(pwd)/output:/app/output \
sneakyscanner /app/configs/your-config.yaml
```
**Technology Stack**:
- **Backend**: Flask 3.x, SQLAlchemy 2.x, SQLite3, APScheduler 3.x
- **Frontend**: Jinja2, Bootstrap 5, Chart.js, Vanilla JavaScript
- **Scanner**: Masscan, Nmap, Playwright (Chromium), sslyze
- **Deployment**: Docker Compose, Gunicorn
---
## Configuration
The YAML configuration file defines the scan parameters:
```yaml
title: "Scan Title" # Required: Report title
sites: # Required: List of sites to scan
- name: "Site Name"
ips:
- address: "192.168.1.10"
expected:
ping: true # Expected ping response
tcp_ports: [22, 80] # Expected TCP ports
udp_ports: [53] # Expected UDP ports
```
See `configs/example-site.yaml` for a complete example.
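Reading that schema is straightforward with PyYAML (an illustrative sketch; the scanner's own parsing code may differ):

```python
import yaml

doc = """
title: "Scan Title"
sites:
  - name: "Site Name"
    ips:
      - address: "192.168.1.10"
        expected:
          ping: true
          tcp_ports: [22, 80]
          udp_ports: [53]
"""

config = yaml.safe_load(doc)          # safe_load: no arbitrary object construction
first_ip = config["sites"][0]["ips"][0]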
---
## Output Formats
After each scan completes, SneakyScanner automatically generates three output formats:
1. **JSON Report** (`scan_report_YYYYMMDD_HHMMSS.json`): Machine-readable scan data with all discovered services, ports, and SSL/TLS information
2. **HTML Report** (`scan_report_YYYYMMDD_HHMMSS.html`): Human-readable report with dark theme, summary dashboard, and detailed service breakdown
3. **ZIP Archive** (`scan_report_YYYYMMDD_HHMMSS.zip`): Contains JSON report, HTML report, and all screenshots for easy sharing and archival
All files share the same timestamp for easy correlation. Screenshots are saved in a subdirectory (`scan_report_YYYYMMDD_HHMMSS_screenshots/`) and included in the ZIP archive. The report includes the total scan duration (in seconds) covering all phases: ping scan, TCP/UDP port discovery, service detection, screenshot capture, and report generation.
```json
{
"title": "Sneaky Infra Scan",
"scan_time": "2024-01-15T10:30:00Z",
"scan_duration": 95.3,
"config_file": "/app/configs/example-site.yaml",
"sites": [
{
"name": "Production Web Servers",
"ips": [
{
"address": "192.168.1.10",
"expected": {
"ping": true,
"tcp_ports": [22, 80, 443],
"udp_ports": [53]
},
"actual": {
"ping": true,
"tcp_ports": [22, 80, 443, 3000],
"udp_ports": [53],
"services": [
{
"port": 22,
"protocol": "tcp",
"service": "ssh",
"product": "OpenSSH",
"version": "8.2p1"
},
{
"port": 80,
"protocol": "tcp",
"service": "http",
"product": "nginx",
"version": "1.18.0",
"http_info": {
"protocol": "http",
"screenshot": "scan_report_20250115_103000_screenshots/192_168_1_10_80.png"
}
},
{
"port": 443,
"protocol": "tcp",
"service": "https",
"product": "nginx",
"http_info": {
"protocol": "https",
"screenshot": "scan_report_20250115_103000_screenshots/192_168_1_10_443.png",
"ssl_tls": {
"certificate": {
"subject": "CN=example.com",
"issuer": "CN=Let's Encrypt Authority X3,O=Let's Encrypt,C=US",
"serial_number": "123456789012345678901234567890",
"not_valid_before": "2025-01-01T00:00:00+00:00",
"not_valid_after": "2025-04-01T23:59:59+00:00",
"days_until_expiry": 89,
"sans": ["example.com", "www.example.com"]
},
"tls_versions": {
"TLS 1.0": {
"supported": false,
"cipher_suites": []
},
"TLS 1.1": {
"supported": false,
"cipher_suites": []
},
"TLS 1.2": {
"supported": true,
"cipher_suites": [
"TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384",
"TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256"
]
},
"TLS 1.3": {
"supported": true,
"cipher_suites": [
"TLS_AES_256_GCM_SHA384",
"TLS_AES_128_GCM_SHA256"
]
}
}
}
}
},
{
"port": 3000,
"protocol": "tcp",
"service": "http",
"product": "Node.js",
"http_info": {
"protocol": "http"
}
}
]
}
}
]
}
]
}
```
## Screenshot Capture Details
SneakyScanner automatically captures webpage screenshots for all discovered HTTP and HTTPS services, providing visual documentation of your infrastructure.
### How It Works
1. **Automatic Detection**: During the HTTP/HTTPS analysis phase, SneakyScanner identifies web services based on:
- Nmap service detection results (http, https, ssl, http-proxy)
- Common web ports (80, 443, 8000, 8006, 8080, 8081, 8443, 8888, 9443)
2. **Screenshot Capture**: For each web service:
- Launches headless Chromium browser (once per scan, reused for all screenshots)
- Navigates to the service URL (HTTP or HTTPS)
- Waits for network to be idle (up to 15 seconds)
- Captures viewport screenshot (1280x720 pixels)
- Handles SSL certificate errors gracefully (e.g., self-signed certificates)
3. **Storage**: Screenshots are saved as PNG files:
- Directory: `output/scan_report_YYYYMMDD_HHMMSS_screenshots/`
- Filename format: `{ip}_{port}.png` (e.g., `192_168_1_10_443.png`)
- Referenced in JSON report under `http_info.screenshot`
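The `{ip}_{port}.png` naming can be reproduced with a one-line helper (hypothetical; the real logic lives in `src/screenshot_capture.py`):

```python
def screenshot_filename(ip, port):
    # Dots in the IP become underscores so the name is filesystem-safe.
    return f"{ip.replace('.', '_')}_{port}.png"

name = screenshot_filename("192.168.1.10", 443)  # "192_168_1_10_443.png"
```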
### Screenshot Configuration
Default settings (configured in `src/screenshot_capture.py`):
- **Viewport size**: 1280x720 (captures visible area only, not full page)
- **Timeout**: 15 seconds per page load
- **Browser**: Chromium (headless mode)
- **SSL handling**: Ignores HTTPS errors (works with self-signed certificates)
- **User agent**: Mozilla/5.0 (Windows NT 10.0; Win64; x64)
### Error Handling
Screenshots are captured on a best-effort basis:
- If a screenshot fails (timeout, connection error, etc.), the scan continues
- Failed screenshots are logged but don't stop the scan
- Services without screenshots simply omit the `screenshot` field in JSON output
## HTML Report Generation
SneakyScanner automatically generates comprehensive HTML reports after each scan, providing an easy-to-read visual interface for analyzing scan results.
### Automatic Generation
HTML reports are automatically created after every scan completes, along with JSON reports and ZIP archives. All three outputs share the same timestamp and are saved to the `output/` directory.
### Manual Generation (Optional)
You can also manually generate HTML reports from existing JSON scan data:
```bash
# Generate HTML report (creates report in same directory as JSON)
python3 src/report_generator.py output/scan_report_20251113_175235.json
# Specify custom output path
python3 src/report_generator.py output/scan_report.json /path/to/custom_report.html
```
### Report Features
The generated HTML report includes:
**Summary Dashboard**:
- **Scan Statistics**: Total IPs scanned, TCP/UDP ports found, services identified, web services, screenshots captured
- **Drift Alerts**: Unexpected TCP/UDP ports, missing expected services, new services detected
- **Security Warnings**: Expiring certificates (<30 days), weak TLS versions (1.0/1.1), self-signed certificates, high port services (>10000)
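The warning thresholds above (<30 days to expiry, TLS 1.0/1.1 support, ports above 10000) can be sketched as a small rule check (hypothetical function operating on service records shaped like the JSON report in this README):

```python
def security_warnings(service):
    warnings = []
    ssl = service.get("ssl_tls", {})
    cert = ssl.get("certificate", {})
    if cert.get("days_until_expiry", 9999) < 30:
        warnings.append("certificate expiring soon")
    for version in ("TLS 1.0", "TLS 1.1"):
        if ssl.get("tls_versions", {}).get(version, {}).get("supported"):
            warnings.append(f"weak TLS: {version}")
    if service.get("port", 0) > 10000:
        warnings.append("high port service")
    return warnings

found = security_warnings({
    "port": 10443,
    "ssl_tls": {
        "certificate": {"days_until_expiry": 12},
        "tls_versions": {"TLS 1.0": {"supported": True}},
    },
})
```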
**Site-by-Site Breakdown**:
- Organized by logical site grouping from configuration
- Per-IP sections with status badges (ping, port drift summary)
- Service tables with expandable details (click any row to expand)
- Visual badges: green (expected), red (unexpected), yellow (missing/warning)
**Service Details** (click to expand):
- Product name, version, extra information, OS type
- HTTP/HTTPS protocol detection
- Screenshot links for web services
- SSL/TLS certificate details (expandable):
- Subject, issuer, validity dates, serial number
- Days until expiration with color-coded warnings
- Subject Alternative Names (SANs)
- TLS version support (1.0, 1.1, 1.2, 1.3) with cipher suites
- Weak TLS and self-signed certificate warnings
**UDP Port Handling**:
- Expected UDP ports shown with green "Expected" badge
- Unexpected UDP ports shown with red "Unexpected" badge
- Missing expected UDP ports shown with yellow "Missing" badge
- Note: Service detection not available for UDP (nmap limitation)
**Design**:
- Dark theme with slate/grey color scheme for comfortable reading
- Responsive layout works on different screen sizes
- No external dependencies - single HTML file
- Minimal JavaScript for expand/collapse functionality
- Optimized hover effects for table rows
### Report Output
The HTML report is a standalone file that can be:
- Opened directly in any web browser (Chrome, Firefox, Safari, Edge)
- Shared via email or file transfer
- Archived for compliance or historical comparison
- Viewed without an internet connection or web server
Screenshot links in the report are relative paths, so keep the report and screenshot directory together.
---
## API Documentation
Complete API reference available at [docs/ai/API_REFERENCE.md](docs/ai/API_REFERENCE.md).
**Quick Reference:**
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/api/scans` | POST | Trigger new scan |
| `/api/scans` | GET | List all scans (paginated) |
| `/api/scans/{id}` | GET | Get scan details |
| `/api/scans/{id}/status` | GET | Get scan status |
| `/api/scans/{id}` | DELETE | Delete scan |
| `/api/settings` | GET | Get all settings |
| `/api/settings/{key}` | PUT | Update setting |
| `/api/settings/health` | GET | Health check |
**Authentication:** All endpoints (except `/api/settings/health`) require session authentication via `/auth/login`.
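A polling loop against `GET /api/scans/{id}/status` might look like the sketch below, with a stubbed fetcher in place of real HTTP (real code would sleep between polls; status values are the ones documented above):

```python
def wait_for_scan(fetch_status, max_polls=100):
    """Poll until the scan reaches a terminal state."""
    for _ in range(max_polls):
        status = fetch_status()  # in real use: GET /api/scans/{id}/status
        if status in ("completed", "failed"):
            return status
    raise TimeoutError("scan did not finish")

# Stub: the scan reports "running" twice, then completes.
responses = iter(["running", "running", "completed"])
final = wait_for_scan(lambda: next(responses))
```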
---
## Deployment
### Production Deployment
See [DEPLOYMENT.md](docs/ai/DEPLOYMENT.md) for comprehensive deployment guide.
**Quick Steps:**
1. **Configure environment variables:**
```bash
cp .env.example .env
# Edit .env and set secure keys
```
2. **Initialize database:**
```bash
docker-compose -f docker-compose-web.yml run --rm web python3 init_db.py
```
3. **Start services:**
```bash
docker-compose -f docker-compose-web.yml up -d
```
4. **Verify health:**
```bash
curl http://localhost:5000/api/settings/health
```
### Docker Volumes
The web application uses persistent volumes:
| Volume | Path | Description |
|--------|------|-------------|
| `data` | `/app/data` | SQLite database |
| `output` | `/app/output` | Scan results (JSON, HTML, ZIP, screenshots) |
| `logs` | `/app/logs` | Application logs |
| `configs` | `/app/configs` | YAML scan configurations |
**Backup:**
```bash
# Backup database
docker cp sneakyscanner_web:/app/data/sneakyscanner.db ./backup/
# Backup all scan results
docker cp sneakyscanner_web:/app/output ./backup/
# Or use docker-compose volumes
docker run --rm -v sneakyscanner_data:/data -v $(pwd)/backup:/backup alpine tar czf /backup/data.tar.gz /data
```
### Environment Variables
See `.env.example` for complete configuration options:
**Flask Configuration:**
- `FLASK_ENV` - Environment mode (production/development)
- `FLASK_DEBUG` - Debug mode (true/false)
- `SECRET_KEY` - Flask secret key for sessions (generate with `secrets.token_hex(32)`)
**Database:**
- `DATABASE_URL` - Database connection string (default: SQLite)
**Security:**
- `SNEAKYSCANNER_ENCRYPTION_KEY` - Encryption key for sensitive settings (generate with `secrets.token_urlsafe(32)`)
**Scheduler:**
- `SCHEDULER_EXECUTORS` - Number of concurrent scan workers (default: 2)
- `SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES` - Max concurrent jobs (default: 3)
---
## Development
### Project Structure
```
SneakyScanner/
├── src/ # Scanner engine (CLI)
│ ├── scanner.py # Main scanner application
│ ├── screenshot_capture.py # Webpage screenshot capture
│ └── report_generator.py # HTML report generation
├── web/ # Web application (Flask)
│ ├── app.py # Flask app factory
│ ├── models.py # SQLAlchemy models (11 tables)
│ ├── api/ # API blueprints
│ │ ├── scans.py # Scan management endpoints
│ │ ├── settings.py # Settings endpoints
│ │ └── ...
│ ├── auth/ # Authentication
│ │ ├── routes.py # Login/logout routes
│ │ ├── decorators.py # Auth decorators
│ │ └── models.py # User model
│ ├── routes/ # Web UI routes
│ │ └── main.py # Dashboard, scans pages
│ ├── services/ # Business logic
│ │ ├── scan_service.py # Scan CRUD operations
│ │ └── scheduler_service.py # APScheduler integration
│ ├── jobs/ # Background jobs
│ │ └── scan_job.py # Async scan execution
│ ├── utils/ # Utilities
│ │ ├── settings.py # Settings manager
│ │ ├── pagination.py # Pagination helper
│ │ └── validators.py # Input validation
│ ├── templates/ # Jinja2 templates
│ │ ├── base.html # Base layout
│ │ ├── login.html # Login page
│ │ ├── dashboard.html # Dashboard
│ │ └── errors/ # Error templates
│ └── static/ # Static assets
│ ├── css/
│ ├── js/
│ └── images/
├── templates/ # Report templates (CLI)
│ └── report_template.html # HTML report template
├── tests/ # Test suite
│ ├── conftest.py # Pytest fixtures
│ ├── test_scan_service.py # Service tests
│ ├── test_scan_api.py # API tests
│ ├── test_authentication.py # Auth tests
│ ├── test_background_jobs.py # Scheduler tests
│ └── test_error_handling.py # Error handling tests
├── migrations/ # Alembic database migrations
│ └── versions/
│ ├── 001_initial_schema.py
│ ├── 002_add_scan_indexes.py
│ └── 003_add_scan_timing_fields.py
├── configs/ # Scan configurations
│ └── example-site.yaml
├── output/ # Scan results
├── docs/ # Documentation
│ ├── ai/ # Development docs
│ │ ├── API_REFERENCE.md
│ │ ├── DEPLOYMENT.md
│ │ ├── PHASE2.md
│ │ ├── PHASE2_COMPLETE.md
│ │ └── ROADMAP.md
│ └── human/
├── Dockerfile # Scanner + web app image
├── docker-compose.yml # CLI scanner compose
├── docker-compose-web.yml # Web app compose
├── requirements.txt # Scanner dependencies
├── requirements-web.txt # Web app dependencies
├── alembic.ini # Alembic configuration
├── init_db.py # Database initialization
├── .env.example # Environment template
├── CLAUDE.md # Developer guide
└── README.md # This file
```
### Running Tests
**In Docker:**
```bash
docker-compose -f docker-compose-web.yml run --rm web pytest tests/ -v
```
**Locally (requires Python 3.12+):**
```bash
pip install -r requirements-web.txt
pytest tests/ -v
# With coverage
pytest tests/ --cov=web --cov-report=html
```
**Test Coverage:**
- 100 test functions across 6 test files
- 1,825 lines of test code
- Coverage: Service layer, API endpoints, authentication, error handling, background jobs
### Database Migrations
**Create new migration:**
```bash
docker-compose -f docker-compose-web.yml run --rm web alembic revision --autogenerate -m "Description"
```
**Apply migrations:**
```bash
docker-compose -f docker-compose-web.yml run --rm web alembic upgrade head
```
**Rollback:**
```bash
docker-compose -f docker-compose-web.yml run --rm web alembic downgrade -1
```
## Security Notice
-This tool requires:
-- `--privileged` flag or `CAP_NET_RAW` capability for masscan and nmap raw socket access
+⚠️ **Important**: This tool requires:
+- `--privileged` flag or `CAP_NET_RAW` capability for raw socket access (masscan/nmap)
 - `--network host` for direct network access
-Only use this tool on networks you own or have explicit authorization to scan. Unauthorized network scanning may be illegal in your jurisdiction.
+**Only use this tool on networks you own or have explicit authorization to scan.** Unauthorized network scanning may be illegal in your jurisdiction.
---
### Security Best Practices
## Roadmap
1. Run on dedicated scan server (not production systems)
2. Restrict network access with firewall rules
3. Use strong passwords and encryption keys
4. Enable HTTPS in production (reverse proxy recommended)
5. Regularly update Docker images and dependencies
**Current Phase:** Phase 2 Complete ✅
**Completed Phases:**
- ✅ **Phase 1** - Database foundation, Flask app structure, settings system
- ✅ **Phase 2** - REST API, background jobs, authentication, basic UI
**Upcoming Phases:**
- 📋 **Phase 3** - Enhanced dashboard, trend charts, scheduled scans (Weeks 5-6)
- 📋 **Phase 4** - Email notifications, scan comparison, alert rules (Weeks 7-8)
- 📋 **Phase 5** - CLI as API client, token authentication (Week 9)
- 📋 **Phase 6** - Advanced features (vulnerability detection, PDF export, timeline view)
See [ROADMAP.md](docs/ai/ROADMAP.md) for detailed feature planning.
See [Deployment Guide](docs/DEPLOYMENT.md) for production security checklist.
---
## Contributing
This is a personal/small team project. For bugs or feature requests:
1. Check existing issues
2. Create detailed bug reports with reproduction steps
3. Submit pull requests with tests
@@ -800,27 +182,17 @@ MIT License - See LICENSE file for details
---
## Security Notice
This tool requires:
- `--privileged` flag or `CAP_NET_RAW` capability for masscan and nmap raw socket access
- `--network host` for direct network access
**⚠️ Important:** Only use this tool on networks you own or have explicit authorization to scan. Unauthorized network scanning may be illegal in your jurisdiction.
---
## Support
-**Documentation:**
-- [API Reference](docs/ai/API_REFERENCE.md)
-- [Deployment Guide](docs/ai/DEPLOYMENT.md)
-- [Developer Guide](CLAUDE.md)
-- [Roadmap](docs/ai/ROADMAP.md)
+**Documentation**:
+- [Deployment Guide](docs/DEPLOYMENT.md)
+- [CLI Scanning Guide](docs/CLI_SCANNING.md)
+- [API Reference](docs/API_REFERENCE.md)
+- [Roadmap](docs/ROADMAP.md)
-**Issues:** https://github.com/anthropics/sneakyscanner/issues
+**Issues**: email me ptarrant at gmail dot com
---
-**Version:** 2.0 (Phase 2 Complete)
-**Last Updated:** 2025-11-14
+**Version**: Phase 4 Complete
+**Last Updated**: 2025-11-17

docker-compose-standalone.yml

@@ -0,0 +1,13 @@
+version: '3.8'
+services:
+  scanner:
+    build: .
+    image: sneakyscanner:latest
+    container_name: sneakyscanner
+    privileged: true    # Required for masscan raw socket access
+    network_mode: host  # Required for network scanning
+    volumes:
+      - ./configs:/app/configs:ro
+      - ./output:/app/output
+    command: /app/configs/example-site.yaml


@@ -1,64 +0,0 @@
version: '3.8'
services:
web:
build: .
image: sneakyscanner:latest
container_name: sneakyscanner-web
# Override entrypoint to run Flask app instead of scanner
entrypoint: ["python3", "-u"]
command: ["-m", "web.app"]
# Note: Using host network mode for scanner capabilities, so no port mapping needed
# The Flask app will be accessible at http://localhost:5000
volumes:
# Mount configs directory for scan configurations (read-write for web UI management)
- ./configs:/app/configs
# Mount output directory for scan results
- ./output:/app/output
# Mount database file for persistence
- ./data:/app/data
# Mount logs directory
- ./logs:/app/logs
environment:
# Flask configuration
- FLASK_APP=web.app
- FLASK_ENV=${FLASK_ENV:-production}
- FLASK_DEBUG=${FLASK_DEBUG:-false}
- FLASK_HOST=0.0.0.0
- FLASK_PORT=5000
# Database configuration (SQLite in mounted volume for persistence)
- DATABASE_URL=sqlite:////app/data/sneakyscanner.db
# Security settings
- SECRET_KEY=${SECRET_KEY:-dev-secret-key-change-in-production}
- SNEAKYSCANNER_ENCRYPTION_KEY=${SNEAKYSCANNER_ENCRYPTION_KEY:-}
# Optional: CORS origins (comma-separated)
- CORS_ORIGINS=${CORS_ORIGINS:-*}
# Optional: Logging level
- LOG_LEVEL=${LOG_LEVEL:-INFO}
# Scheduler configuration (APScheduler)
- SCHEDULER_EXECUTORS=${SCHEDULER_EXECUTORS:-2}
- SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES=${SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES:-3}
# Scanner functionality requires privileged mode and host network for masscan/nmap
privileged: true
network_mode: host
# Health check to ensure web service is running
healthcheck:
test: ["CMD", "python3", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:5000/api/settings/health').read()"]
interval: 60s
timeout: 10s
retries: 3
start_period: 40s
restart: unless-stopped
# Optional: Initialize database on first run
# Run with: docker-compose -f docker-compose-web.yml run --rm init-db
init-db:
build: .
image: sneakyscanner:latest
container_name: sneakyscanner-init-db
entrypoint: ["python3"]
command: ["init_db.py", "--db-url", "sqlite:////app/data/sneakyscanner.db"]
volumes:
- ./data:/app/data
profiles:
- tools


@@ -1,13 +1,64 @@
version: '3.8'
services:
scanner:
web:
build: .
image: sneakyscanner:latest
container_name: sneakyscanner
privileged: true # Required for masscan raw socket access
network_mode: host # Required for network scanning
container_name: sneakyscanner-web
# Override entrypoint to run Flask app instead of scanner
entrypoint: ["python3", "-u"]
command: ["-m", "web.app"]
# Note: Using host network mode for scanner capabilities, so no port mapping needed
# The Flask app will be accessible at http://localhost:5000
volumes:
- ./configs:/app/configs:ro
# Mount configs directory for scan configurations (read-write for web UI management)
- ./configs:/app/configs
# Mount output directory for scan results
- ./output:/app/output
command: /app/configs/example-site.yaml
# Mount database file for persistence
- ./data:/app/data
# Mount logs directory
- ./logs:/app/logs
environment:
# Flask configuration
- FLASK_APP=web.app
- FLASK_ENV=${FLASK_ENV:-production}
- FLASK_DEBUG=${FLASK_DEBUG:-false}
- FLASK_HOST=0.0.0.0
- FLASK_PORT=5000
# Database configuration (SQLite in mounted volume for persistence)
- DATABASE_URL=sqlite:////app/data/sneakyscanner.db
# Security settings
- SECRET_KEY=${SECRET_KEY:-dev-secret-key-change-in-production}
- SNEAKYSCANNER_ENCRYPTION_KEY=${SNEAKYSCANNER_ENCRYPTION_KEY:-}
# Optional: CORS origins (comma-separated)
- CORS_ORIGINS=${CORS_ORIGINS:-*}
# Optional: Logging level
- LOG_LEVEL=${LOG_LEVEL:-INFO}
# Scheduler configuration (APScheduler)
- SCHEDULER_EXECUTORS=${SCHEDULER_EXECUTORS:-2}
- SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES=${SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES:-3}
# Scanner functionality requires privileged mode and host network for masscan/nmap
privileged: true
network_mode: host
# Health check to ensure web service is running
healthcheck:
test: ["CMD", "python3", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:5000/api/settings/health').read()"]
interval: 60s
timeout: 10s
retries: 3
start_period: 40s
restart: unless-stopped
# Optional: Initialize database on first run
# Run with: docker-compose -f docker-compose-web.yml run --rm init-db
init-db:
build: .
image: sneakyscanner:latest
container_name: sneakyscanner-init-db
entrypoint: ["python3"]
command: ["init_db.py", "--db-url", "sqlite:////app/data/sneakyscanner.db"]
volumes:
- ./data:/app/data
profiles:
- tools

docs/API_REFERENCE.md Normal file

File diff suppressed because it is too large

docs/CLI_SCANNING.md Normal file

@@ -0,0 +1,502 @@
# CLI Scanner Guide
The SneakyScanner CLI provides a standalone scanning tool for quick one-off scans, testing, or CI/CD pipelines without requiring the web application.
## Table of Contents
1. [Quick Start](#quick-start)
2. [Configuration](#configuration)
3. [Scan Performance](#scan-performance)
4. [Output Formats](#output-formats)
5. [Screenshot Capture](#screenshot-capture)
6. [HTML Reports](#html-reports)
7. [Advanced Usage](#advanced-usage)
---
## Quick Start
### Using Docker Compose (Recommended)
```bash
# Build the image
docker compose -f docker-compose-standalone.yml build
# Run a scan
docker compose -f docker-compose-standalone.yml up
# Results saved to ./output/ directory
```
### Using Docker Directly
```bash
# Build the image
docker build -t sneakyscanner .
# Run a scan
docker run --rm --privileged --network host \
-v $(pwd)/configs:/app/configs:ro \
-v $(pwd)/output:/app/output \
sneakyscanner /app/configs/your-config.yaml
```
### Requirements
- Docker
- Linux host (required for `--privileged` and `--network host`)
- Configuration file in `configs/` directory
---
## Configuration
The YAML configuration file defines scan parameters and expectations.
### Basic Configuration
```yaml
title: "My Infrastructure Scan"
sites:
- name: "Web Servers"
ips:
- address: "192.168.1.10"
expected:
ping: true
tcp_ports: [22, 80, 443]
udp_ports: []
services: ["ssh", "http", "https"]
```
### CIDR Range Configuration
```yaml
title: "Network Scan"
sites:
- name: "Production Network"
cidr: "192.168.1.0/24"
expected_ports:
- port: 22
protocol: tcp
service: "ssh"
- port: 80
protocol: tcp
service: "http"
- port: 443
protocol: tcp
service: "https"
ping_expected: true
```
### Configuration Reference
| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `title` | string | Yes | Report title |
| `sites` | array | Yes | List of sites to scan |
| `sites[].name` | string | Yes | Site name |
| `sites[].ips` | array | Conditional | List of IP addresses (if not using CIDR) |
| `sites[].cidr` | string | Conditional | CIDR range (if not using IPs) |
| `sites[].ips[].address` | string | Yes | IP address |
| `sites[].ips[].expected.ping` | boolean | No | Expected ping response |
| `sites[].ips[].expected.tcp_ports` | array | No | Expected TCP ports |
| `sites[].ips[].expected.udp_ports` | array | No | Expected UDP ports |
| `sites[].ips[].expected.services` | array | No | Expected service names |
| `sites[].expected_ports` | array | No | Expected ports for CIDR range |
| `sites[].ping_expected` | boolean | No | Expected ping for CIDR range |
See `configs/example-site.yaml` for a complete example.
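Before launching a long scan, the required fields above can be checked with a few lines of Python. This is a hedged sketch using PyYAML; `validate_config` is an illustrative helper, not part of the scanner itself:

```python
import yaml  # PyYAML

REQUIRED_TOP = {"title", "sites"}

def validate_config(text: str) -> list:
    """Return a list of schema problems; an empty list means the config looks valid."""
    errors = []
    cfg = yaml.safe_load(text) or {}
    missing = REQUIRED_TOP - set(cfg)
    if missing:
        errors.append(f"missing top-level keys: {sorted(missing)}")
    for site in cfg.get("sites") or []:
        if "name" not in site:
            errors.append("site missing 'name'")
        if not ("ips" in site or "cidr" in site):
            errors.append(f"site {site.get('name', '?')!r} needs 'ips' or 'cidr'")
    return errors

sample = """
title: "My Infrastructure Scan"
sites:
  - name: "Web Servers"
    ips:
      - address: "192.168.1.10"
"""
print(validate_config(sample))  # []
```

Catching a missing `ips`/`cidr` key locally is much faster than discovering it after a container spins up.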
---
## Scan Performance
SneakyScanner uses a five-phase approach for comprehensive scanning:
1. **Ping Scan** (masscan): ICMP echo detection
- Duration: ~1-2 seconds
- Tests network reachability
2. **TCP Port Discovery** (masscan): Scans all 65535 TCP ports
- Rate: 10,000 packets/second
- Duration: ~13 seconds for 2 IPs at the default rate
3. **UDP Port Discovery** (masscan): Scans all 65535 UDP ports
- Rate: 10,000 packets/second
- Duration: ~13 seconds for 2 IPs at the default rate
4. **Service Detection** (nmap): Identifies services on discovered TCP ports
- Intensity level: 5 (balanced)
- Duration: ~20-60 seconds per IP with open ports
- Extracts product names and versions
5. **HTTP/HTTPS Analysis**: Web protocol detection and SSL/TLS analysis
- Screenshot capture (Playwright)
- Certificate extraction (sslyze)
- TLS version testing
- Duration: ~10-20 seconds per web service
**Example**: Scanning 2 IPs with 10 open ports each (including 2-3 web services) typically takes 2-3 minutes total.
### Performance Tuning
Adjust scan rate in the scanner code if needed:
- Default: 10,000 pps (packets per second)
- Increase for faster scans (may cause network congestion)
- Decrease for slower, more reliable scans
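The trade-off is easy to quantify: masscan sends roughly one probe per (IP, port) pair, so a phase's lower bound is ports × IPs ÷ rate. A back-of-the-envelope sketch (not scanner code):

```python
def masscan_phase_estimate(num_ips: int, rate_pps: int = 10_000, ports: int = 65_536) -> float:
    """Rough lower bound in seconds: one probe per (IP, port) pair at the given rate."""
    return num_ips * ports / rate_pps

# 2 IPs at the default 10,000 pps: ~13 seconds, matching the phase timings above
print(round(masscan_phase_estimate(2), 1))  # 13.1
```

Doubling the rate halves the phase duration, at the cost of more network congestion and a higher chance of dropped probes.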
---
## Output Formats
After each scan completes, SneakyScanner automatically generates three report formats plus a screenshots directory:
### 1. JSON Report
**Filename**: `scan_report_YYYYMMDD_HHMMSS.json`
Machine-readable scan data with all discovered services, ports, and SSL/TLS information.
**Structure**:
```json
{
"title": "Scan Title",
"scan_time": "2025-01-15T10:30:00Z",
"scan_duration": 95.3,
"config_file": "/app/configs/example-site.yaml",
"sites": [
{
"name": "Site Name",
"ips": [
{
"address": "192.168.1.10",
"expected": {...},
"actual": {
"ping": true,
"tcp_ports": [22, 80, 443],
"udp_ports": [],
"services": [...]
}
}
]
}
]
}
```
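Because `expected` and `actual` sit side by side, drift detection over the JSON report is only a few lines. A sketch assuming the structure above (the inline report here is a trimmed stand-in for a real file):

```python
import json

report = json.loads("""{
  "sites": [{"name": "Site Name", "ips": [{
    "address": "192.168.1.10",
    "expected": {"tcp_ports": [22, 443]},
    "actual": {"tcp_ports": [22, 80, 443]}}]}]
}""")

for site in report["sites"]:
    for ip in site["ips"]:
        # Ports that showed up open but were not declared in the config
        unexpected = sorted(set(ip["actual"]["tcp_ports"]) - set(ip["expected"].get("tcp_ports", [])))
        if unexpected:
            print(f"{ip['address']}: unexpected TCP ports {unexpected}")
```

The same set-difference approach works in reverse for missing (expected-but-closed) ports.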
### 2. HTML Report
**Filename**: `scan_report_YYYYMMDD_HHMMSS.html`
Human-readable report with dark theme, summary dashboard, and detailed service breakdown.
Features:
- Summary statistics
- Drift alerts (unexpected ports/services)
- Security warnings (expiring certs, weak TLS)
- Site-by-site breakdown
- Expandable service details
- SSL/TLS certificate information
See [HTML Reports](#html-reports) section for details.
### 3. ZIP Archive
**Filename**: `scan_report_YYYYMMDD_HHMMSS.zip`
Contains:
- JSON report
- HTML report
- All screenshots (if web services were found)
Useful for:
- Easy sharing
- Archival
- Compliance documentation
### 4. Screenshots Directory
**Directory**: `scan_report_YYYYMMDD_HHMMSS_screenshots/`
PNG screenshots of all discovered web services:
- Filename format: `{ip}_{port}.png` (e.g., `192_168_1_10_443.png`)
- Viewport size: 1280x720
- Referenced in JSON report under `http_info.screenshot`
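The naming scheme is a simple substitution, which is handy when mapping screenshots back to hosts in your own tooling (an illustrative helper, not the scanner's own code):

```python
def screenshot_name(ip: str, port: int) -> str:
    """192.168.1.10 + 443 -> 192_168_1_10_443.png (dots become underscores)."""
    return f"{ip.replace('.', '_')}_{port}.png"

print(screenshot_name("192.168.1.10", 443))  # 192_168_1_10_443.png
```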
---
## Screenshot Capture
SneakyScanner automatically captures webpage screenshots for all discovered HTTP and HTTPS services.
### Automatic Detection
Screenshots are captured for services identified as web services based on:
- Nmap service detection results (http, https, ssl, http-proxy)
- Common web ports (80, 443, 8000, 8006, 8080, 8081, 8443, 8888, 9443)
### Capture Process
For each web service:
1. **Launch Browser**: Headless Chromium (once per scan, reused)
2. **Navigate**: To service URL (HTTP or HTTPS)
3. **Wait**: For network to be idle (up to 15 seconds)
4. **Capture**: Viewport screenshot (1280x720 pixels)
5. **Save**: As PNG file in screenshots directory
### Configuration
Default settings (configured in `src/screenshot_capture.py`):
| Setting | Value |
|---------|-------|
| Viewport size | 1280x720 |
| Timeout | 15 seconds |
| Browser | Chromium (headless) |
| SSL handling | Ignores HTTPS errors |
| User agent | Mozilla/5.0 (Windows NT 10.0; Win64; x64) |
### Error Handling
Screenshots are captured on a best-effort basis:
- Failed screenshots are logged but don't stop the scan
- Services without screenshots omit the `screenshot` field in JSON
- Common errors: timeout, connection refused, invalid SSL
### Disabling Screenshots
To disable screenshot capture, modify `src/screenshot_capture.py` or comment out the screenshot phase in `src/scanner.py`.
---
## HTML Reports
SneakyScanner automatically generates comprehensive HTML reports after each scan.
### Automatic Generation
HTML reports are created after every scan, along with JSON reports and ZIP archives. All outputs share the same timestamp.
### Manual Generation
Generate HTML reports from existing JSON scan data:
```bash
# Generate HTML report (creates report in same directory as JSON)
cd app/
python3 src/report_generator.py ../output/scan_report_20250115_103000.json
# Specify custom output path
python3 src/report_generator.py ../output/scan_report.json /path/to/custom_report.html
```
### Report Features
**Summary Dashboard**:
- **Scan Statistics**: Total IPs, TCP/UDP ports, services, web services, screenshots
- **Drift Alerts**: Unexpected ports, missing services, new services
- **Security Warnings**: Expiring certificates (<30 days), weak TLS (1.0/1.1), self-signed certs, high ports (>10000)
**Site-by-Site Breakdown**:
- Organized by logical site grouping from configuration
- Per-IP sections with status badges (ping, port drift)
- Service tables with expandable details (click to expand)
- Visual badges: green (expected), red (unexpected), yellow (missing/warning)
**Service Details** (expandable):
- Product name, version, extra information, OS type
- HTTP/HTTPS protocol detection
- Screenshot links for web services
- SSL/TLS certificate details:
- Subject, issuer, validity dates, serial number
- Days until expiration (color-coded warnings)
- Subject Alternative Names (SANs)
- TLS version support (1.0, 1.1, 1.2, 1.3) with cipher suites
- Weak TLS and self-signed certificate warnings
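The color-coded expiry warning boils down to a date difference on the certificate's validity end date. A sketch, assuming ISO 8601 timestamps like those in the JSON report:

```python
from datetime import datetime, timezone

def days_until_expiry(not_valid_after: str) -> int:
    """ISO 8601 timestamp -> whole days remaining (negative once expired)."""
    expiry = datetime.fromisoformat(not_valid_after)
    return (expiry - datetime.now(timezone.utc)).days

# Certificates inside the 30-day warning window would be flagged:
print(days_until_expiry("2999-01-01T00:00:00+00:00") < 30)  # False
```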
**UDP Port Handling**:
- Expected UDP ports: green "Expected" badge
- Unexpected UDP ports: red "Unexpected" badge
- Missing UDP ports: yellow "Missing" badge
- Note: Service detection not available for UDP (nmap limitation)
**Design**:
- Dark theme with slate/grey color scheme
- Responsive layout
- No external dependencies (single HTML file)
- Minimal JavaScript for expand/collapse
- Optimized hover effects
### Report Portability
The HTML report is a standalone file that can be:
- Opened in any web browser (Chrome, Firefox, Safari, Edge)
- Shared via email or file transfer
- Archived for compliance or historical comparison
- Viewed without internet connection
**Note**: Screenshot links use relative paths, so keep the report and screenshot directory together.
---
## Advanced Usage
### Running on Remote Targets
```bash
# Scan remote network via Docker host
docker run --rm --privileged --network host \
-v $(pwd)/configs:/app/configs:ro \
-v $(pwd)/output:/app/output \
sneakyscanner /app/configs/remote-network.yaml
```
**Note**: The Docker host must have network access to the target network.
### CI/CD Integration
```yaml
# Example GitLab CI pipeline
scan-infrastructure:
stage: test
image: docker:latest
services:
- docker:dind
script:
- docker build -t sneakyscanner .
- docker run --rm --privileged --network host \
-v $PWD/configs:/app/configs:ro \
-v $PWD/output:/app/output \
sneakyscanner /app/configs/production.yaml
artifacts:
paths:
- output/
expire_in: 30 days
```
### Batch Scanning
```bash
# Scan multiple configs sequentially
for config in configs/*.yaml; do
docker run --rm --privileged --network host \
-v $(pwd)/configs:/app/configs:ro \
-v $(pwd)/output:/app/output \
sneakyscanner "/app/configs/$(basename $config)"
done
```
### Custom Output Directory
```bash
# Use custom output directory
mkdir -p /path/to/custom/output
docker run --rm --privileged --network host \
-v $(pwd)/configs:/app/configs:ro \
-v /path/to/custom/output:/app/output \
sneakyscanner /app/configs/config.yaml
```
---
## Troubleshooting
### Permission Denied Errors
**Problem**: masscan or nmap fails with permission denied
**Solution**: Ensure Docker is running with `--privileged` flag:
```bash
docker run --rm --privileged --network host ...
```
### No Ports Found
**Problem**: Scan completes but finds no open ports
**Possible Causes**:
- Firewall blocking scans
- Wrong network (ensure `--network host`)
- Target hosts are down
- Incorrect IP addresses in config
**Debug**:
```bash
# Test ping manually
ping 192.168.1.10
# Check Docker network mode
docker inspect <container-id> | grep NetworkMode
```
### Screenshots Failing
**Problem**: Screenshots not being captured
**Possible Causes**:
- Chromium not installed (check Dockerfile)
- Timeout too short (increase in screenshot_capture.py)
- Web service requires authentication
- SSL certificate errors
**Debug**: Check scan logs for screenshot errors
### Scan Takes Too Long
**Problem**: Scan runs for 30+ minutes
**Solutions**:
- Reduce scan rate (edit scanner.py)
- Limit port range (edit scanner.py to scan specific ports)
- Reduce number of IPs in config
- Disable UDP scanning if not needed
---
## Security Considerations
### Privileged Mode
The CLI scanner requires `--privileged` flag for:
- Raw socket access (masscan, nmap)
- ICMP echo requests (ping)
**Security implications**:
- Container has extensive host capabilities
- Only run on trusted networks
- Don't expose to public networks
### Network Mode: Host
The scanner uses `--network host` for:
- Direct network access without NAT
- Raw packet sending
- Accurate service detection
**Security implications**:
- Container shares host network namespace
- Can access all host network interfaces
- Bypass Docker network isolation
### Best Practices
1. **Only scan authorized networks**
2. **Run on dedicated scan server** (not production)
3. **Limit network access** with firewall rules
4. **Review scan configs** before running
5. **Store results securely** (may contain sensitive data)
---
## Support
- **Deployment Guide**: [docs/DEPLOYMENT.md](DEPLOYMENT.md)
- **API Reference**: [docs/API_REFERENCE.md](API_REFERENCE.md)
- **Roadmap**: [docs/ROADMAP.md](ROADMAP.md)
---
**Last Updated**: 2025-11-17
**Version**: Phase 4 Complete

docs/DEPLOYMENT.md Normal file

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -1,766 +0,0 @@
# SneakyScanner Web API Reference
**Version:** 2.0 (Phase 2)
**Base URL:** `http://localhost:5000`
**Authentication:** Session-based (Flask-Login)
## Table of Contents
1. [Authentication](#authentication)
2. [Scans API](#scans-api)
3. [Settings API](#settings-api)
4. [Error Handling](#error-handling)
5. [Status Codes](#status-codes)
6. [Request/Response Examples](#request-response-examples)
---
## Authentication
SneakyScanner uses session-based authentication with Flask-Login. All API endpoints (except login) require authentication.
### Login
Authenticate and create a session.
**Endpoint:** `POST /auth/login`
**Request Body:**
```json
{
"password": "your-password-here"
}
```
**Success Response (200 OK):**
```json
{
"message": "Login successful",
"redirect": "/dashboard"
}
```
**Error Response (401 Unauthorized):**
```json
{
"error": "Invalid password"
}
```
**Usage Example:**
```bash
# Login and save session cookie
curl -X POST http://localhost:5000/auth/login \
-H "Content-Type: application/json" \
-d '{"password":"yourpassword"}' \
-c cookies.txt
# Use session cookie for subsequent requests
curl -X GET http://localhost:5000/api/scans \
-b cookies.txt
```
### Logout
Destroy the current session.
**Endpoint:** `GET /auth/logout`
**Success Response:** Redirects to login page (302)
---
## Scans API
Manage network scans: trigger, list, view, and delete.
### Trigger Scan
Start a new background scan.
**Endpoint:** `POST /api/scans`
**Authentication:** Required
**Request Body:**
```json
{
"config_file": "/app/configs/example-site.yaml"
}
```
**Success Response (201 Created):**
```json
{
"scan_id": 42,
"status": "running",
"message": "Scan queued successfully"
}
```
**Error Responses:**
*400 Bad Request* - Invalid config file:
```json
{
"error": "Invalid config file",
"message": "Config file does not exist or is not valid YAML"
}
```
*500 Internal Server Error* - Scan queue failure:
```json
{
"error": "Failed to queue scan",
"message": "Internal server error"
}
```
**Usage Example:**
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/production.yaml"}' \
-b cookies.txt
```
### List Scans
Retrieve a paginated list of scans with optional status filtering.
**Endpoint:** `GET /api/scans`
**Authentication:** Required
**Query Parameters:**
| Parameter | Type | Required | Default | Description |
|-----------|------|----------|---------|-------------|
| `page` | integer | No | 1 | Page number (1-indexed) |
| `per_page` | integer | No | 20 | Items per page (1-100) |
| `status` | string | No | - | Filter by status: `running`, `completed`, `failed` |
**Success Response (200 OK):**
```json
{
"scans": [
{
"id": 42,
"timestamp": "2025-11-14T10:30:00Z",
"duration": 125.5,
"status": "completed",
"title": "Production Network Scan",
"config_file": "/app/configs/production.yaml",
"triggered_by": "manual",
"started_at": "2025-11-14T10:30:00Z",
"completed_at": "2025-11-14T10:32:05Z"
},
{
"id": 41,
"timestamp": "2025-11-13T15:00:00Z",
"duration": 98.2,
"status": "completed",
"title": "Development Network Scan",
"config_file": "/app/configs/dev.yaml",
"triggered_by": "scheduled",
"started_at": "2025-11-13T15:00:00Z",
"completed_at": "2025-11-13T15:01:38Z"
}
],
"total": 42,
"page": 1,
"per_page": 20,
"pages": 3
}
```
**Error Responses:**
*400 Bad Request* - Invalid parameters:
```json
{
"error": "Invalid pagination parameters",
"message": "Page and per_page must be positive integers"
}
```
**Usage Examples:**
```bash
# List first page (default 20 items)
curl -X GET http://localhost:5000/api/scans \
-b cookies.txt
# List page 2 with 50 items per page
curl -X GET "http://localhost:5000/api/scans?page=2&per_page=50" \
-b cookies.txt
# List only running scans
curl -X GET "http://localhost:5000/api/scans?status=running" \
-b cookies.txt
```
### Get Scan Details
Retrieve complete details for a specific scan, including all sites, IPs, ports, services, certificates, and TLS versions.
**Endpoint:** `GET /api/scans/{id}`
**Authentication:** Required
**Path Parameters:**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | integer | Yes | Scan ID |
**Success Response (200 OK):**
```json
{
"id": 42,
"timestamp": "2025-11-14T10:30:00Z",
"duration": 125.5,
"status": "completed",
"title": "Production Network Scan",
"config_file": "/app/configs/production.yaml",
"json_path": "/app/output/scan_report_20251114_103000.json",
"html_path": "/app/output/scan_report_20251114_103000.html",
"zip_path": "/app/output/scan_report_20251114_103000.zip",
"screenshot_dir": "/app/output/scan_report_20251114_103000_screenshots",
"triggered_by": "manual",
"started_at": "2025-11-14T10:30:00Z",
"completed_at": "2025-11-14T10:32:05Z",
"error_message": null,
"sites": [
{
"id": 101,
"site_name": "Production Web Servers",
"ips": [
{
"id": 201,
"ip_address": "192.168.1.10",
"ping_expected": true,
"ping_actual": true,
"ports": [
{
"id": 301,
"port": 443,
"protocol": "tcp",
"expected": true,
"state": "open",
"services": [
{
"id": 401,
"service_name": "https",
"product": "nginx",
"version": "1.24.0",
"extrainfo": null,
"ostype": "Linux",
"http_protocol": "https",
"screenshot_path": "scan_report_20251114_103000_screenshots/192_168_1_10_443.png",
"certificates": [
{
"id": 501,
"subject": "CN=example.com",
"issuer": "CN=Let's Encrypt Authority X3,O=Let's Encrypt,C=US",
"serial_number": "123456789012345678901234567890",
"not_valid_before": "2025-01-01T00:00:00+00:00",
"not_valid_after": "2025-04-01T23:59:59+00:00",
"days_until_expiry": 89,
"sans": "[\"example.com\", \"www.example.com\"]",
"is_self_signed": false,
"tls_versions": [
{
"id": 601,
"tls_version": "TLS 1.2",
"supported": true,
"cipher_suites": "[\"TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384\", \"TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256\"]"
},
{
"id": 602,
"tls_version": "TLS 1.3",
"supported": true,
"cipher_suites": "[\"TLS_AES_256_GCM_SHA384\", \"TLS_AES_128_GCM_SHA256\"]"
}
]
}
]
}
]
}
]
}
]
}
]
}
```
**Error Responses:**
*404 Not Found* - Scan doesn't exist:
```json
{
"error": "Scan not found",
"message": "Scan with ID 42 does not exist"
}
```
**Usage Example:**
```bash
curl -X GET http://localhost:5000/api/scans/42 \
-b cookies.txt
```
### Get Scan Status
Poll the current status of a running scan. Use this endpoint to track scan progress.
**Endpoint:** `GET /api/scans/{id}/status`
**Authentication:** Required
**Path Parameters:**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | integer | Yes | Scan ID |
**Success Response (200 OK):**
*Running scan:*
```json
{
"scan_id": 42,
"status": "running",
"started_at": "2025-11-14T10:30:00Z",
"completed_at": null,
"error_message": null
}
```
*Completed scan:*
```json
{
"scan_id": 42,
"status": "completed",
"started_at": "2025-11-14T10:30:00Z",
"completed_at": "2025-11-14T10:32:05Z",
"error_message": null
}
```
*Failed scan:*
```json
{
"scan_id": 42,
"status": "failed",
"started_at": "2025-11-14T10:30:00Z",
"completed_at": "2025-11-14T10:30:15Z",
"error_message": "Config file not found: /app/configs/missing.yaml"
}
```
**Error Responses:**
*404 Not Found* - Scan doesn't exist:
```json
{
"error": "Scan not found",
"message": "Scan with ID 42 does not exist"
}
```
**Usage Example:**
```bash
# Poll status every 5 seconds
while true; do
curl -X GET http://localhost:5000/api/scans/42/status -b cookies.txt
sleep 5
done
```
### Delete Scan
Delete a scan and all associated files (JSON, HTML, ZIP, screenshots).
**Endpoint:** `DELETE /api/scans/{id}`
**Authentication:** Required
**Path Parameters:**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `id` | integer | Yes | Scan ID |
**Success Response (200 OK):**
```json
{
"message": "Scan 42 deleted successfully"
}
```
**Error Responses:**
*404 Not Found* - Scan doesn't exist:
```json
{
"error": "Scan not found",
"message": "Scan with ID 42 does not exist"
}
```
**Usage Example:**
```bash
curl -X DELETE http://localhost:5000/api/scans/42 \
-b cookies.txt
```
---
## Settings API
Manage application settings including SMTP configuration, encryption keys, and preferences.
### Get All Settings
Retrieve all application settings. Sensitive values (passwords, keys) are masked.
**Endpoint:** `GET /api/settings`
**Authentication:** Required
**Success Response (200 OK):**
```json
{
"smtp_server": "smtp.gmail.com",
"smtp_port": 587,
"smtp_username": "alerts@example.com",
"smtp_password": "********",
"smtp_from_email": "alerts@example.com",
"smtp_to_emails": "[\"admin@example.com\"]",
"retention_days": 90,
"app_password": "********"
}
```
**Usage Example:**
```bash
curl -X GET http://localhost:5000/api/settings \
-b cookies.txt
```
### Update Setting
Update a specific setting value.
**Endpoint:** `PUT /api/settings/{key}`
**Authentication:** Required
**Path Parameters:**
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `key` | string | Yes | Setting key (e.g., `smtp_server`) |
**Request Body:**
```json
{
"value": "smtp.example.com"
}
```
**Success Response (200 OK):**
```json
{
"message": "Setting updated successfully",
"key": "smtp_server",
"value": "smtp.example.com"
}
```
**Error Responses:**
*400 Bad Request* - Missing value:
```json
{
"error": "Missing required field: value"
}
```
**Usage Example:**
```bash
curl -X PUT http://localhost:5000/api/settings/smtp_server \
-H "Content-Type: application/json" \
-d '{"value":"smtp.example.com"}' \
-b cookies.txt
```
### Health Check
Check if the API is running and database is accessible.
**Endpoint:** `GET /api/settings/health`
**Authentication:** Not required
**Success Response (200 OK):**
```json
{
"status": "healthy",
"database": "connected"
}
```
**Error Response (500 Internal Server Error):**
```json
{
"status": "unhealthy",
"database": "disconnected",
"error": "Connection error details"
}
```
**Usage Example:**
```bash
curl -X GET http://localhost:5000/api/settings/health
```
---
## Error Handling
### Error Response Format
All error responses follow a consistent JSON format:
```json
{
"error": "Brief error type",
"message": "Detailed error message for debugging"
}
```
### Content Negotiation
The API supports content negotiation based on the request:
- **API Requests** (Accept: application/json or /api/* path): Returns JSON errors
- **Web Requests** (Accept: text/html): Returns HTML error pages
**Example:**
```bash
# JSON error response
curl -X GET http://localhost:5000/api/scans/999 \
-H "Accept: application/json" \
-b cookies.txt
# HTML error page
curl -X GET http://localhost:5000/scans/999 \
-H "Accept: text/html" \
-b cookies.txt
```
### Request ID Tracking
Every request receives a unique request ID for tracking and debugging:
**Response Headers:**
```
X-Request-ID: a1b2c3d4
X-Request-Duration-Ms: 125
```
Check application logs for detailed error information using the request ID:
```
2025-11-14 10:30:15 INFO [a1b2c3d4] GET /api/scans 200 125ms
```
---
## Status Codes
### Success Codes
| Code | Meaning | Usage |
|------|---------|-------|
| 200 | OK | Successful GET, PUT, DELETE requests |
| 201 | Created | Successful POST request that creates a resource |
| 302 | Found | Redirect (used for logout, login success) |
### Client Error Codes
| Code | Meaning | Usage |
|------|---------|-------|
| 400 | Bad Request | Invalid request parameters or body |
| 401 | Unauthorized | Authentication required or failed |
| 403 | Forbidden | Authenticated but not authorized |
| 404 | Not Found | Resource doesn't exist |
| 405 | Method Not Allowed | HTTP method not supported for endpoint |
### Server Error Codes
| Code | Meaning | Usage |
|------|---------|-------|
| 500 | Internal Server Error | Unexpected server error |
---
## Request/Response Examples
### Complete Workflow: Trigger and Monitor Scan
```bash
#!/bin/bash
# 1. Login and save session
curl -X POST http://localhost:5000/auth/login \
-H "Content-Type: application/json" \
-d '{"password":"yourpassword"}' \
-c cookies.txt
# 2. Trigger a new scan
RESPONSE=$(curl -s -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/production.yaml"}' \
-b cookies.txt)
# Extract scan ID from response
SCAN_ID=$(echo $RESPONSE | jq -r '.scan_id')
echo "Scan ID: $SCAN_ID"
# 3. Poll status every 5 seconds until complete
while true; do
STATUS=$(curl -s -X GET http://localhost:5000/api/scans/$SCAN_ID/status \
-b cookies.txt | jq -r '.status')
echo "Status: $STATUS"
if [ "$STATUS" == "completed" ] || [ "$STATUS" == "failed" ]; then
break
fi
sleep 5
done
# 4. Get full scan results
curl -X GET http://localhost:5000/api/scans/$SCAN_ID \
-b cookies.txt | jq '.'
# 5. Logout
curl -X GET http://localhost:5000/auth/logout \
-b cookies.txt
```
### Pagination Example
```bash
#!/bin/bash
# Get total number of scans
TOTAL=$(curl -s -X GET "http://localhost:5000/api/scans?per_page=1" \
-b cookies.txt | jq -r '.total')
echo "Total scans: $TOTAL"
# Calculate number of pages (50 items per page)
PER_PAGE=50
PAGES=$(( ($TOTAL + $PER_PAGE - 1) / $PER_PAGE ))
echo "Total pages: $PAGES"
# Fetch all pages
for PAGE in $(seq 1 $PAGES); do
echo "Fetching page $PAGE..."
curl -s -X GET "http://localhost:5000/api/scans?page=$PAGE&per_page=$PER_PAGE" \
-b cookies.txt | jq '.scans[] | {id, timestamp, title, status}'
done
```
### Filter by Status
```bash
# Get all running scans
curl -X GET "http://localhost:5000/api/scans?status=running" \
-b cookies.txt | jq '.scans[] | {id, title, started_at}'
# Get all failed scans
curl -X GET "http://localhost:5000/api/scans?status=failed" \
-b cookies.txt | jq '.scans[] | {id, title, error_message}'
# Get all completed scans from last 24 hours
curl -X GET "http://localhost:5000/api/scans?status=completed" \
-b cookies.txt | jq '.scans[] | select(.completed_at > (now - 86400 | todate)) | {id, title, duration}'
```
---
## Rate Limiting
Currently no rate limiting is implemented. For production deployments, consider:
- Adding nginx rate limiting
- Implementing application-level rate limiting with Flask-Limiter
- Setting connection limits in Gunicorn configuration
---
## Security Considerations
### Authentication
- All API endpoints (except `/auth/login` and `/api/settings/health`) require authentication
- Session cookies are httpOnly and secure (in production with HTTPS)
- Passwords are hashed with bcrypt (cost factor 12)
- Sensitive settings values are encrypted at rest
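The application itself uses bcrypt for password hashing; as a stdlib-only illustration of the same salted slow-hash-and-verify pattern (PBKDF2 standing in for bcrypt here, with invented helper names), the flow looks like this:

```python
import hashlib
import hmac
import os


def hash_password(password, iterations=600_000):
    """Hash a password with a random salt; store salt, cost, and digest together."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt.hex() + ":" + str(iterations) + ":" + digest.hex()


def verify_password(password, stored):
    """Recompute the digest with the stored salt and compare in constant time."""
    salt_hex, iters, digest_hex = stored.split(":")
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iters)
    )
    return hmac.compare_digest(candidate.hex(), digest_hex)
```

The constant-time comparison (`hmac.compare_digest`) matters: a plain `==` on digests can leak timing information to an attacker probing the login endpoint.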
### CORS
CORS is configured via environment variable `CORS_ORIGINS`. Default: `*` (allow all).
For production, set to specific origins:
```bash
CORS_ORIGINS=https://scanner.example.com,https://admin.example.com
```
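A plausible sketch of how such a comma-separated value might be parsed at startup (the function name is invented; the app's actual parsing may differ):

```python
import os


def parse_cors_origins(raw=None):
    """Split CORS_ORIGINS into a list of origins, falling back to allow-all."""
    raw = raw if raw is not None else os.environ.get("CORS_ORIGINS", "*")
    origins = [origin.strip() for origin in raw.split(",") if origin.strip()]
    return origins or ["*"]
```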
### HTTPS
For production deployments:
1. Use HTTPS (TLS/SSL) for all requests
2. Set `SESSION_COOKIE_SECURE=True` in Flask config
3. Consider using a reverse proxy (nginx) with SSL termination
### Input Validation
All inputs are validated:
- Config file paths are checked for existence and valid YAML
- Pagination parameters are sanitized (positive integers, max per_page: 100)
- Scan IDs are validated as integers
- Setting values are type-checked
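For example, the pagination rule above could be enforced by a helper along these lines (a sketch, not the app's actual implementation):

```python
MAX_PER_PAGE = 100


def sanitize_pagination(page_raw, per_page_raw, default_per_page=20):
    """Validate pagination query params: positive page, per_page capped at 100."""
    try:
        page = int(page_raw) if page_raw is not None else 1
        per_page = int(per_page_raw) if per_page_raw is not None else default_per_page
    except (TypeError, ValueError):
        raise ValueError("page and per_page must be integers")
    if page < 1:
        raise ValueError("page must be a positive integer")
    return page, min(max(per_page, 1), MAX_PER_PAGE)
```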
---
## Versioning
**Current Version:** 2.0 (Phase 2)
API versioning will be implemented in Phase 5. For now, the API is considered unstable and may change between phases.
**Breaking Changes:**
- Phase 2 → Phase 3: Possible schema changes for scan comparison
- Phase 3 → Phase 4: Possible authentication changes (token auth)
---
## Support
For issues, questions, or feature requests:
- GitHub Issues: https://github.com/anthropics/sneakyscanner/issues
- Documentation: `/docs/ai/` directory in repository
---
**Last Updated:** 2025-11-14
**Phase:** 2 - Flask Web App Core
**Next Update:** Phase 3 - Dashboard & Scheduling

---
# SneakyScanner Deployment Guide
## Table of Contents
1. [Overview](#overview)
2. [Prerequisites](#prerequisites)
3. [Quick Start](#quick-start)
4. [Configuration](#configuration)
5. [First-Time Setup](#first-time-setup)
6. [Running the Application](#running-the-application)
7. [Volume Management](#volume-management)
8. [Health Monitoring](#health-monitoring)
9. [Troubleshooting](#troubleshooting)
10. [Security Considerations](#security-considerations)
11. [Upgrading](#upgrading)
12. [Backup and Restore](#backup-and-restore)
---
## Overview
SneakyScanner is deployed as a Docker container running a Flask web application with an integrated network scanner. The application requires privileged mode and host networking to perform network scans using masscan and nmap.
**Architecture:**
- **Web Application**: Flask app on port 5000
- **Database**: SQLite (persisted to volume)
- **Background Jobs**: APScheduler for async scan execution
- **Scanner**: masscan, nmap, sslyze, Playwright
---
## Prerequisites
### System Requirements
- **Operating System**: Linux (Ubuntu 20.04+, Debian 11+, or similar)
- **Docker**: Version 20.10+ or Docker Engine 24.0+
- **Docker Compose**: Version 2.0+ (or docker-compose 1.29+)
- **Memory**: Minimum 2GB RAM (4GB+ recommended)
- **Disk Space**: Minimum 5GB free space
- **Permissions**: Root/sudo access for Docker privileged mode
### Network Requirements
- Outbound internet access for Docker image downloads
- Access to target networks for scanning
- Port 5000 available on host (or configure alternative)
### Install Docker and Docker Compose
**Ubuntu/Debian:**
```bash
# Install Docker
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
# Add your user to docker group
sudo usermod -aG docker $USER
newgrp docker
# Verify installation
docker --version
docker compose version
```
**Other Linux distributions:** See [Docker installation guide](https://docs.docker.com/engine/install/)
---
## Quick Start
For users who want to get started immediately:
```bash
# 1. Clone the repository
git clone <repository-url>
cd SneakyScan
# 2. Create environment file
cp .env.example .env
# Edit .env and set SECRET_KEY and SNEAKYSCANNER_ENCRYPTION_KEY
nano .env
# 3. Build the Docker image
docker compose -f docker-compose-web.yml build
# 4. Initialize the database and set password
docker compose -f docker-compose-web.yml run --rm init-db --password "YourSecurePassword"
# 5. Start the application
docker compose -f docker-compose-web.yml up -d
# 6. Access the web interface
# Open browser to: http://localhost:5000
```
---
## Configuration
### Environment Variables
SneakyScanner is configured via environment variables. The recommended approach is to use a `.env` file.
#### Creating Your .env File
```bash
# Copy the example file
cp .env.example .env
# Generate secure keys
python3 -c "import secrets; print('SECRET_KEY=' + secrets.token_hex(32))" >> .env
python3 -c "from cryptography.fernet import Fernet; print('SNEAKYSCANNER_ENCRYPTION_KEY=' + Fernet.generate_key().decode())" >> .env
# Edit other settings as needed
nano .env
```
#### Key Configuration Options
| Variable | Description | Default | Required |
|----------|-------------|---------|----------|
| `FLASK_ENV` | Environment mode (`production` or `development`) | `production` | Yes |
| `FLASK_DEBUG` | Enable debug mode (`true` or `false`) | `false` | Yes |
| `SECRET_KEY` | Flask session secret (change in production!) | `dev-secret-key-change-in-production` | **Yes** |
| `SNEAKYSCANNER_ENCRYPTION_KEY` | Encryption key for sensitive settings | (empty) | **Yes** |
| `DATABASE_URL` | SQLite database path | `sqlite:////app/data/sneakyscanner.db` | Yes |
| `LOG_LEVEL` | Logging level (DEBUG, INFO, WARNING, ERROR) | `INFO` | No |
| `SCHEDULER_EXECUTORS` | Number of concurrent scan threads | `2` | No |
| `SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES` | Max instances of same job | `3` | No |
| `CORS_ORIGINS` | CORS allowed origins (comma-separated) | `*` | No |
**Important Security Note:**
- **ALWAYS** change `SECRET_KEY` and `SNEAKYSCANNER_ENCRYPTION_KEY` in production
- Never commit `.env` file to version control
- Use strong, randomly-generated keys
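As a hedged sketch of how these variables might be read at startup (the variable names come from the table above; the mapping function itself is an assumption, not the app's actual code):

```python
import os


def load_flask_settings():
    """Illustrative mapping of the environment variables above to config values."""
    return {
        "ENV": os.environ.get("FLASK_ENV", "production"),
        "DEBUG": os.environ.get("FLASK_DEBUG", "false").lower() == "true",
        "SECRET_KEY": os.environ.get("SECRET_KEY", "dev-secret-key-change-in-production"),
        "DATABASE_URL": os.environ.get("DATABASE_URL", "sqlite:////app/data/sneakyscanner.db"),
        "LOG_LEVEL": os.environ.get("LOG_LEVEL", "INFO"),
        "SCHEDULER_EXECUTORS": int(os.environ.get("SCHEDULER_EXECUTORS", "2")),
    }
```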
---
## First-Time Setup
### Step 1: Prepare Directories
The application needs these directories (created automatically by Docker):
```bash
# Verify directories exist
ls -la configs/ data/ output/ logs/
# If missing, create them
mkdir -p configs data output logs
```
### Step 2: Configure Scan Targets
Create YAML configuration files for your scan targets:
```bash
# Example configuration
cat > configs/my-network.yaml <<EOF
title: "My Network Infrastructure"
sites:
- name: "Web Servers"
ips:
- address: "192.168.1.10"
expected:
ping: true
tcp_ports: [80, 443]
udp_ports: []
services: ["http", "https"]
EOF
```
### Step 3: Build Docker Image
```bash
# Build the image (takes 5-10 minutes on first run)
docker compose -f docker-compose-web.yml build
# Verify image was created
docker images | grep sneakyscanner
```
### Step 4: Initialize Database
The database must be initialized before first use:
```bash
# Initialize database and set application password
docker compose -f docker-compose-web.yml run --rm init-db --password "YourSecurePassword"
# The init-db command will:
# - Create database schema
# - Run all Alembic migrations
# - Set the application password
# - Create default settings
```
**Password Requirements:**
- Minimum 8 characters recommended
- Use a strong, unique password
- Store securely (password manager)
### Step 5: Verify Configuration
```bash
# Check database file was created
ls -lh data/sneakyscanner.db
# Verify Docker Compose configuration
docker compose -f docker-compose-web.yml config
```
---
## Running the Application
### Starting the Application
```bash
# Start in detached mode (background)
docker compose -f docker-compose-web.yml up -d
# View logs during startup
docker compose -f docker-compose-web.yml logs -f web
# Expected output:
# web_1 | INFO:werkzeug: * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
```
### Accessing the Web Interface
1. Open browser to: **http://localhost:5000**
2. Login with the password you set during database initialization
3. Dashboard will display recent scans and statistics
### Stopping the Application
```bash
# Stop containers (preserves data)
docker compose -f docker-compose-web.yml down
# Stop and remove volumes (WARNING: deletes all data!)
docker compose -f docker-compose-web.yml down -v
```
### Restarting the Application
```bash
# Restart all services
docker compose -f docker-compose-web.yml restart
# Restart only the web service
docker compose -f docker-compose-web.yml restart web
```
### Viewing Logs
```bash
# View all logs
docker compose -f docker-compose-web.yml logs
# Follow logs in real-time
docker compose -f docker-compose-web.yml logs -f
# View last 100 lines
docker compose -f docker-compose-web.yml logs --tail=100
# View logs for specific service
docker compose -f docker-compose-web.yml logs web
```
---
## Volume Management
### Understanding Volumes
SneakyScanner uses several mounted volumes for data persistence:
| Volume | Container Path | Purpose | Important? |
|--------|----------------|---------|------------|
| `./configs` | `/app/configs` | Scan configuration files (read-only) | Yes |
| `./data` | `/app/data` | SQLite database | **Critical** |
| `./output` | `/app/output` | Scan results (JSON, HTML, ZIP) | Yes |
| `./logs` | `/app/logs` | Application logs | No |
### Backing Up Data
```bash
# Create backup directory
mkdir -p backups/$(date +%Y%m%d)
# Backup database
cp data/sneakyscanner.db backups/$(date +%Y%m%d)/
# Backup scan outputs
tar -czf backups/$(date +%Y%m%d)/output.tar.gz output/
# Backup configurations
tar -czf backups/$(date +%Y%m%d)/configs.tar.gz configs/
```
### Restoring Data
```bash
# Stop application
docker compose -f docker-compose-web.yml down
# Restore database
cp backups/YYYYMMDD/sneakyscanner.db data/
# Restore outputs
tar -xzf backups/YYYYMMDD/output.tar.gz
# Restart application
docker compose -f docker-compose-web.yml up -d
```
### Cleaning Up Old Scan Results
```bash
# Find old scan results (older than 30 days)
find output/ -type f -name "scan_report_*.json" -mtime +30
# Delete old scan results
find output/ -type f -name "scan_report_*" -mtime +30 -delete
# Or use the API to delete scans from UI/API
```
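The `find`-based cleanup above can also be done from Python, for instance as a scheduled job; a sketch (the retention policy and function name are assumptions):

```python
import time
from pathlib import Path


def purge_old_reports(output_dir, max_age_days=30, pattern="scan_report_*"):
    """Delete scan report files older than max_age_days; return what was removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in Path(output_dir).glob(pattern):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return sorted(removed)
```

Deleting files directly bypasses the database, so records for those scans will still exist; deleting via the API keeps both in sync.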
---
## Health Monitoring
### Health Check Endpoint
SneakyScanner includes a built-in health check endpoint:
```bash
# Check application health
curl http://localhost:5000/api/settings/health
# Expected response:
# {"status": "healthy"}
```
### Docker Health Status
```bash
# Check container health status
docker ps | grep sneakyscanner-web
# View health check logs
docker inspect sneakyscanner-web | grep -A 10 Health
```
### Monitoring Logs
```bash
# Watch for errors in logs
docker compose -f docker-compose-web.yml logs -f | grep ERROR
# Check application log file
tail -f logs/sneakyscanner.log
```
---
## Troubleshooting
### Container Won't Start
**Problem**: Container exits immediately after starting
```bash
# Check logs for errors
docker compose -f docker-compose-web.yml logs web
# Common issues:
# 1. Database not initialized - run init-db first
# 2. Permission issues with volumes - check directory ownership
# 3. Port 5000 already in use - change FLASK_PORT or stop conflicting service
```
### Database Initialization Fails
**Problem**: `init_db.py` fails with errors
```bash
# Check database directory permissions
ls -la data/
# Fix permissions if needed
sudo chown -R $USER:$USER data/
# Verify SQLite is accessible
sqlite3 data/sneakyscanner.db "SELECT 1;" 2>&1
# Remove corrupted database and reinitialize
rm data/sneakyscanner.db
docker compose -f docker-compose-web.yml run --rm init-db --password "YourPassword"
```
### Scans Fail with "Permission Denied"
**Problem**: Scanner cannot run masscan/nmap
```bash
# Verify container is running in privileged mode
docker inspect sneakyscanner-web | grep Privileged
# Should show: "Privileged": true
# Verify network mode is host
docker inspect sneakyscanner-web | grep NetworkMode
# Should show: "NetworkMode": "host"
# If not, verify docker-compose-web.yml has:
# privileged: true
# network_mode: host
```
### Can't Access Web Interface
**Problem**: Browser can't connect to http://localhost:5000
```bash
# Verify container is running
docker ps | grep sneakyscanner-web
# Check if Flask is listening
docker compose -f docker-compose-web.yml exec web netstat -tlnp | grep 5000
# Check firewall rules
sudo ufw status | grep 5000
# Try from container host
curl http://localhost:5000/api/settings/health
# Check logs for binding errors
docker compose -f docker-compose-web.yml logs web | grep -i bind
```
### Background Scans Not Running
**Problem**: Scans stay in "running" status forever
```bash
# Check scheduler is initialized
docker compose -f docker-compose-web.yml logs web | grep -i scheduler
# Check for job execution errors
docker compose -f docker-compose-web.yml logs web | grep -i "execute_scan"
# Verify APScheduler environment variables
docker compose -f docker-compose-web.yml exec web env | grep SCHEDULER
```
### Health Check Failing
**Problem**: Docker health check shows "unhealthy"
```bash
# Run health check manually
docker compose -f docker-compose-web.yml exec web \
python3 -c "import urllib.request; print(urllib.request.urlopen('http://localhost:5000/api/settings/health').read())"
# Check if health endpoint exists
curl -v http://localhost:5000/api/settings/health
# Common causes:
# 1. Application crashed - check logs
# 2. Database locked - check for long-running scans
# 3. Flask not fully started - wait 40s (start_period)
```
---
## Security Considerations
### Production Deployment Checklist
- [ ] Changed `SECRET_KEY` to random value
- [ ] Changed `SNEAKYSCANNER_ENCRYPTION_KEY` to random value
- [ ] Set strong application password
- [ ] Set `FLASK_ENV=production`
- [ ] Set `FLASK_DEBUG=false`
- [ ] Configured proper `CORS_ORIGINS` (not `*`)
- [ ] Using HTTPS/TLS (reverse proxy recommended)
- [ ] Restricted network access (firewall rules)
- [ ] Regular backups configured
- [ ] Log monitoring enabled
### Network Security
**Privileged Mode Considerations:**
- Container runs with `--privileged` flag for raw socket access (masscan/nmap)
- This grants extensive host capabilities - only run on trusted networks
- Restrict Docker host access with firewall rules
- Consider running on dedicated scan server
**Recommendations:**
```bash
# Restrict access to port 5000 with firewall
sudo ufw allow from 192.168.1.0/24 to any port 5000
sudo ufw enable
# Or use reverse proxy (nginx, Apache) with authentication
```
### HTTPS/TLS Setup
SneakyScanner does not include built-in TLS. For production, use a reverse proxy:
**Example nginx configuration:**
```nginx
server {
listen 443 ssl http2;
server_name scanner.example.com;
ssl_certificate /path/to/cert.pem;
ssl_certificate_key /path/to/key.pem;
location / {
proxy_pass http://localhost:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
```
### File Permissions
```bash
# Ensure proper ownership of data directories
sudo chown -R $USER:$USER data/ output/ logs/
# Restrict database file permissions
chmod 600 data/sneakyscanner.db
# Configs should be read-only
chmod 444 configs/*.yaml
```
---
## Upgrading
### Upgrading to New Version
```bash
# 1. Stop the application
docker compose -f docker-compose-web.yml down
# 2. Backup database
cp data/sneakyscanner.db data/sneakyscanner.db.backup
# 3. Pull latest code
git pull origin master
# 4. Rebuild Docker image
docker compose -f docker-compose-web.yml build
# 5. Run database migrations
docker compose -f docker-compose-web.yml run --rm web alembic upgrade head
# 6. Start application
docker compose -f docker-compose-web.yml up -d
# 7. Verify upgrade
docker compose -f docker-compose-web.yml logs -f
curl http://localhost:5000/api/settings/health
```
### Rolling Back
If upgrade fails:
```bash
# Stop new version
docker compose -f docker-compose-web.yml down
# Restore database backup
cp data/sneakyscanner.db.backup data/sneakyscanner.db
# Checkout previous version
git checkout <previous-version-tag>
# Rebuild and start
docker compose -f docker-compose-web.yml build
docker compose -f docker-compose-web.yml up -d
```
---
## Backup and Restore
### Automated Backup Script
Create `backup.sh`:
```bash
#!/bin/bash
BACKUP_DIR="backups/$(date +%Y%m%d_%H%M%S)"
mkdir -p "$BACKUP_DIR"
# Stop application for consistent backup
docker compose -f docker-compose-web.yml stop web
# Backup database
cp data/sneakyscanner.db "$BACKUP_DIR/"
# Backup outputs (last 30 days only)
find output/ -type f -mtime -30 -exec cp --parents {} "$BACKUP_DIR/" \;
# Backup configs
cp -r configs/ "$BACKUP_DIR/"
# Restart application
docker compose -f docker-compose-web.yml start web
echo "Backup complete: $BACKUP_DIR"
```
Make executable and schedule with cron:
```bash
chmod +x backup.sh
# Add to crontab (daily at 2 AM)
crontab -e
# Add line:
0 2 * * * /path/to/SneakyScan/backup.sh
```
### Restore from Backup
```bash
# Stop application
docker compose -f docker-compose-web.yml down
# Restore files
cp backups/YYYYMMDD_HHMMSS/sneakyscanner.db data/
cp -r backups/YYYYMMDD_HHMMSS/configs/* configs/
cp -r backups/YYYYMMDD_HHMMSS/output/* output/
# Start application
docker compose -f docker-compose-web.yml up -d
```
---
## Support and Further Reading
- **Project README**: `README.md` - General project information
- **API Documentation**: `docs/ai/API_REFERENCE.md` - REST API reference
- **Developer Guide**: `docs/ai/DEVELOPMENT.md` - Development setup and architecture
- **Phase 2 Documentation**: `docs/ai/PHASE2.md` - Implementation details
- **Issue Tracker**: File bugs and feature requests on GitHub
---
**Last Updated**: 2025-11-14
**Version**: Phase 2 - Web Application Complete

---
# SneakyScanner Phase 2 - Manual Testing Checklist
**Version:** 2.0 (Phase 2)
**Last Updated:** 2025-11-14
This document provides a comprehensive manual testing checklist for validating the SneakyScanner web application. Use this checklist to verify all features work correctly before deployment or release.
---
## Table of Contents
1. [Prerequisites](#prerequisites)
2. [Deployment & Startup](#deployment--startup)
3. [Authentication](#authentication)
4. [Scan Management (Web UI)](#scan-management-web-ui)
5. [Scan Management (API)](#scan-management-api)
6. [Error Handling](#error-handling)
7. [Performance & Concurrency](#performance--concurrency)
8. [Data Persistence](#data-persistence)
9. [Security](#security)
10. [Cleanup](#cleanup)
---
## Prerequisites
Before starting manual testing:
- [ ] Docker and Docker Compose installed
- [ ] `.env` file configured with proper keys
- [ ] Test scan configuration available (e.g., `configs/example-site.yaml`)
- [ ] Network access for scanning (if using real targets)
- [ ] Browser for web UI testing (Chrome, Firefox, Safari, Edge)
- [ ] `curl` and `jq` for API testing
- [ ] At least 2GB free disk space for scan results
**Recommended Test Environment:**
- Clean database (no existing scans)
- Test config with 1-2 IPs, 2-3 expected ports
- Expected scan duration: 1-3 minutes
---
## Deployment & Startup
### Test 1: Environment Configuration
**Objective:** Verify environment variables are properly configured.
**Steps:**
1. Check `.env` file exists:
```bash
ls -la .env
```
2. Verify required keys are set (not defaults):
```bash
grep SECRET_KEY .env
grep SNEAKYSCANNER_ENCRYPTION_KEY .env
```
3. Verify keys are not default values:
```bash
grep -v "your-secret-key-here" .env | grep SECRET_KEY
```
**Expected Result:**
- [ ] `.env` file exists
- [ ] `SECRET_KEY` is set to unique value (not `your-secret-key-here`)
- [ ] `SNEAKYSCANNER_ENCRYPTION_KEY` is set to unique value
- [ ] All required environment variables present
### Test 2: Docker Compose Startup
**Objective:** Verify web application starts successfully.
**Steps:**
1. Start services:
```bash
docker-compose -f docker-compose-web.yml up -d
```
2. Check container status:
```bash
docker-compose -f docker-compose-web.yml ps
```
3. Check logs for errors:
```bash
docker-compose -f docker-compose-web.yml logs web | tail -50
```
4. Wait 30 seconds for healthcheck to pass
**Expected Result:**
- [ ] Container starts without errors
- [ ] Status shows "Up" or "healthy"
- [ ] No error messages in logs
- [ ] Port 5000 is listening
### Test 3: Health Check
**Objective:** Verify health check endpoint responds correctly.
**Steps:**
1. Call health endpoint:
```bash
curl -s http://localhost:5000/api/settings/health | jq '.'
```
**Expected Result:**
- [ ] HTTP 200 status code
- [ ] Response: `{"status": "healthy", "database": "connected"}`
- [ ] No authentication required
### Test 4: Database Initialization
**Objective:** Verify database was created and initialized.
**Steps:**
1. Check database file exists:
```bash
docker exec sneakyscanner_web ls -lh /app/data/sneakyscanner.db
```
2. Verify database has tables:
```bash
docker exec sneakyscanner_web sqlite3 /app/data/sneakyscanner.db ".tables"
```
**Expected Result:**
- [ ] Database file exists (`sneakyscanner.db`)
- [ ] Database file size > 0 bytes
- [ ] All 11 tables present: `scans`, `scan_sites`, `scan_ips`, `scan_ports`, `scan_services`, `scan_certificates`, `scan_tls_versions`, `schedules`, `alerts`, `alert_rules`, `settings`
---
## Authentication
### Test 5: Login Page Access
**Objective:** Verify unauthenticated users are redirected to login.
**Steps:**
1. Open browser to http://localhost:5000/dashboard (without logging in)
2. Observe redirect
**Expected Result:**
- [ ] Redirected to http://localhost:5000/login
- [ ] Login page displays correctly
- [ ] Dark theme applied (slate/grey colors)
- [ ] Password field visible
- [ ] "Login" button visible
### Test 6: Login with Correct Password
**Objective:** Verify successful login flow.
**Steps:**
1. Navigate to http://localhost:5000/login
2. Enter password (default: `admin`)
3. Click "Login" button
**Expected Result:**
- [ ] Redirected to http://localhost:5000/dashboard
- [ ] No error messages
- [ ] Navigation bar shows "Dashboard", "Scans", "Settings", "Logout"
- [ ] Welcome message displayed
### Test 7: Login with Incorrect Password
**Objective:** Verify failed login handling.
**Steps:**
1. Navigate to http://localhost:5000/login
2. Enter incorrect password (e.g., `wrongpassword`)
3. Click "Login" button
**Expected Result:**
- [ ] Stays on login page (no redirect)
- [ ] Error message displayed: "Invalid password"
- [ ] Password field cleared
- [ ] Can retry login
### Test 8: Logout
**Objective:** Verify logout destroys session.
**Steps:**
1. Login successfully
2. Navigate to http://localhost:5000/dashboard
3. Click "Logout" in navigation bar
4. Try to access http://localhost:5000/dashboard again
**Expected Result:**
- [ ] Logout redirects to login page
- [ ] Flash message: "Logged out successfully"
- [ ] Session destroyed (redirected to login when accessing protected pages)
- [ ] Cannot access dashboard without re-logging in
### Test 9: API Authentication (Session Cookie)
**Objective:** Verify API endpoints require authentication.
**Steps:**
1. Call API endpoint without authentication:
```bash
curl -i http://localhost:5000/api/scans
```
2. Login and save session cookie:
```bash
curl -X POST http://localhost:5000/auth/login \
-H "Content-Type: application/json" \
-d '{"password":"admin"}' \
-c cookies.txt
```
3. Call API endpoint with session cookie:
```bash
curl -b cookies.txt http://localhost:5000/api/scans
```
**Expected Result:**
- [ ] Request without auth returns 401 Unauthorized
- [ ] Login returns 200 OK with session cookie
- [ ] Request with auth cookie returns 200 OK with scan data
---
## Scan Management (Web UI)
### Test 10: Dashboard Display
**Objective:** Verify dashboard loads and displays correctly.
**Steps:**
1. Login successfully
2. Navigate to http://localhost:5000/dashboard
3. Observe page content
**Expected Result:**
- [ ] Dashboard loads without errors
- [ ] Welcome message displayed
- [ ] "Run Scan Now" button visible
- [ ] Recent scans section visible (may be empty)
- [ ] Navigation works
### Test 11: Trigger Scan via Web UI
**Objective:** Verify scan can be triggered from dashboard.
**Steps:**
1. Login and go to dashboard
2. Click "Run Scan Now" button
3. Observe scan starts
4. Wait for scan to complete (1-3 minutes)
**Expected Result:**
- [ ] Scan starts (status shows "Running")
- [ ] Scan appears in recent scans list
- [ ] Scan ID assigned and displayed
- [ ] Status updates to "Completed" after scan finishes
- [ ] No error messages
**Note:** If the "Run Scan Now" button is not yet implemented, use the API to trigger a scan (Test 15).
### Test 12: View Scan List
**Objective:** Verify scan list page displays correctly.
**Steps:**
1. Login successfully
2. Navigate to http://localhost:5000/scans
3. Trigger at least 3 scans (via API or UI)
4. Refresh scan list page
**Expected Result:**
- [ ] Scan list page loads
- [ ] All scans displayed in table
- [ ] Columns: ID, Timestamp, Title, Status, Actions
- [ ] Pagination controls visible (if > 20 scans)
- [ ] Each scan has "View" and "Delete" buttons
### Test 13: View Scan Details
**Objective:** Verify scan detail page displays complete results.
**Steps:**
1. From scan list, click "View" on a completed scan
2. Observe scan details page
**Expected Result:**
- [ ] Scan details page loads (http://localhost:5000/scans/{id})
- [ ] Scan metadata displayed (ID, timestamp, duration, status)
- [ ] Sites section visible
- [ ] IPs section visible with ping status
- [ ] Ports section visible (TCP/UDP)
- [ ] Services section visible with product/version
- [ ] HTTPS services show certificate details (if applicable)
- [ ] TLS versions displayed (if applicable)
- [ ] Screenshot links work (if screenshots captured)
- [ ] Download buttons for JSON/HTML/ZIP files
### Test 14: Delete Scan via Web UI
**Objective:** Verify scan deletion removes all data and files.
**Steps:**
1. Login and navigate to scan list
2. Note a scan ID to delete
3. Click "Delete" button on scan
4. Confirm deletion
5. Check database and filesystem
**Expected Result:**
- [ ] Confirmation prompt appears
- [ ] After confirmation, scan removed from list
- [ ] Scan no longer appears in database
- [ ] JSON/HTML/ZIP files deleted from filesystem
- [ ] Screenshot directory deleted
- [ ] Success message displayed
---
## Scan Management (API)
### Test 15: Trigger Scan via API
**Objective:** Verify scan can be triggered via REST API.
**Steps:**
1. Login and save session cookie (see Test 9)
2. Trigger scan:
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/example-site.yaml"}' \
-b cookies.txt | jq '.'
```
3. Note the `scan_id` from response
**Expected Result:**
- [ ] HTTP 201 Created response
- [ ] Response includes `scan_id` (integer)
- [ ] Response includes `status: "running"`
- [ ] Response includes `message: "Scan queued successfully"`
### Test 16: Poll Scan Status
**Objective:** Verify scan status can be polled via API.
**Steps:**
1. Trigger a scan (Test 15) and note `scan_id`
2. Poll status immediately:
```bash
curl -b cookies.txt http://localhost:5000/api/scans/{scan_id}/status | jq '.'
```
3. Wait 30 seconds and poll again
4. Continue polling until status is `completed` or `failed`
**Expected Result:**
- [ ] Initial status: `"running"`
- [ ] Response includes `started_at` timestamp
- [ ] Response includes `completed_at: null` while running
- [ ] After completion: status changes to `"completed"` or `"failed"`
- [ ] `completed_at` timestamp set when done
- [ ] If failed, `error_message` is present
### Test 17: Get Scan Details via API
**Objective:** Verify complete scan details can be retrieved via API.
**Steps:**
1. Trigger a scan and wait for completion
2. Get scan details:
```bash
curl -b cookies.txt http://localhost:5000/api/scans/{scan_id} | jq '.'
```
**Expected Result:**
- [ ] HTTP 200 OK response
- [ ] Response includes all scan metadata (id, timestamp, duration, status, title)
- [ ] Response includes file paths (json_path, html_path, zip_path, screenshot_dir)
- [ ] Response includes `sites` array
- [ ] Each site includes `ips` array
- [ ] Each IP includes `ports` array
- [ ] Each port includes `services` array
- [ ] HTTPS services include `certificates` array (if applicable)
- [ ] Certificates include `tls_versions` array (if applicable)
- [ ] All relationships properly nested
### Test 18: List Scans with Pagination
**Objective:** Verify scan list API supports pagination.
**Steps:**
1. Trigger at least 25 scans
2. List first page:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?page=1&per_page=20" | jq '.'
```
3. List second page:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?page=2&per_page=20" | jq '.'
```
**Expected Result:**
- [ ] First page returns 20 scans
- [ ] Response includes `total` (total count)
- [ ] Response includes `page: 1` and `pages` (total pages)
- [ ] Response includes `per_page: 20`
- [ ] Second page returns remaining scans
- [ ] No duplicate scans between pages
### Test 19: Filter Scans by Status
**Objective:** Verify scan list can be filtered by status.
**Steps:**
1. Trigger scans with different statuses (running, completed, failed)
2. Filter by running:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?status=running" | jq '.'
```
3. Filter by completed:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?status=completed" | jq '.'
```
4. Filter by failed:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?status=failed" | jq '.'
```
**Expected Result:**
- [ ] Each filter returns only scans with matching status
- [ ] Total count reflects filtered results
- [ ] Empty status filter returns all scans
### Test 20: Delete Scan via API
**Objective:** Verify scan deletion via REST API.
**Steps:**
1. Trigger a scan and wait for completion
2. Note the `scan_id`
3. Delete scan:
```bash
curl -X DELETE -b cookies.txt http://localhost:5000/api/scans/{scan_id} | jq '.'
```
4. Verify deletion:
```bash
curl -b cookies.txt http://localhost:5000/api/scans/{scan_id}
```
5. Check filesystem for scan files
**Expected Result:**
- [ ] Delete returns HTTP 200 OK
- [ ] Delete response: `{"message": "Scan {id} deleted successfully"}`
- [ ] Subsequent GET returns HTTP 404 Not Found
- [ ] JSON/HTML/ZIP files deleted from filesystem
- [ ] Screenshot directory deleted
- [ ] Database record removed
---
## Error Handling
### Test 21: Invalid Config File
**Objective:** Verify proper error handling for invalid config files.
**Steps:**
1. Trigger scan with non-existent config:
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/nonexistent.yaml"}' \
-b cookies.txt | jq '.'
```
**Expected Result:**
- [ ] HTTP 400 Bad Request
- [ ] Response includes `error` and `message` fields
- [ ] Error message indicates config file invalid/not found
- [ ] No scan record created in database
### Test 22: Missing Required Field
**Objective:** Verify API validates required fields.
**Steps:**
1. Trigger scan without config_file:
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{}' \
-b cookies.txt | jq '.'
```
**Expected Result:**
- [ ] HTTP 400 Bad Request
- [ ] Error message indicates missing required field
### Test 23: Non-Existent Scan ID
**Objective:** Verify 404 handling for non-existent scans.
**Steps:**
1. Get scan with invalid ID:
```bash
curl -b cookies.txt http://localhost:5000/api/scans/99999 | jq '.'
```
**Expected Result:**
- [ ] HTTP 404 Not Found
- [ ] Response: `{"error": "Scan not found", "message": "Scan with ID 99999 does not exist"}`
### Test 24: Invalid Pagination Parameters
**Objective:** Verify pagination parameter validation.
**Steps:**
1. Request with invalid page number:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?page=-1" | jq '.'
```
2. Request with invalid per_page:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?per_page=1000" | jq '.'
```
**Expected Result:**
- [ ] HTTP 400 Bad Request for negative page
- [ ] per_page capped at maximum (100)
- [ ] Error message indicates validation failure
### Test 25: Content Negotiation
**Objective:** Verify API returns JSON and web UI returns HTML for errors.
**Steps:**
1. Access non-existent scan via API:
```bash
curl -H "Accept: application/json" http://localhost:5000/api/scans/99999
```
2. Access non-existent scan via browser:
- Open http://localhost:5000/scans/99999 in browser
**Expected Result:**
- [ ] API request returns JSON error
- [ ] Browser request returns HTML error page
- [ ] HTML error page matches dark theme
- [ ] HTML error page has navigation back to dashboard
### Test 26: Error Templates
**Objective:** Verify custom error templates render correctly.
**Steps:**
1. Trigger 400 error (bad request)
2. Trigger 401 error (unauthorized - access API without login)
3. Trigger 404 error (non-existent page - http://localhost:5000/nonexistent)
4. Trigger 405 error (method not allowed - POST to GET-only endpoint)
**Expected Result:**
- [ ] Each error displays custom error page
- [ ] Error pages use dark theme
- [ ] Error pages include error code and message
- [ ] Error pages have "Back to Dashboard" link
- [ ] Navigation bar visible on error pages (if authenticated)
### Test 27: Request ID Tracking
**Objective:** Verify request IDs are generated and included in responses.
**Steps:**
1. Make API request and check headers:
```bash
curl -i -b cookies.txt http://localhost:5000/api/scans
```
**Expected Result:**
- [ ] Response includes `X-Request-ID` header
- [ ] Request ID is an 8-character hex string
- [ ] Response includes `X-Request-Duration-Ms` header
- [ ] Duration is a positive integer (milliseconds)
### Test 28: Logging
**Objective:** Verify requests are logged with request IDs.
**Steps:**
1. Make API request
2. Check logs:
```bash
docker-compose -f docker-compose-web.yml logs web | tail -20
```
**Expected Result:**
- [ ] Logs include request ID in brackets `[a1b2c3d4]`
- [ ] Logs include HTTP method, path, status code
- [ ] Logs include request duration in milliseconds
- [ ] Error logs include stack traces (if applicable)
---
## Performance & Concurrency
### Test 29: Concurrent Scans
**Objective:** Verify multiple scans can run concurrently.
**Steps:**
1. Trigger 3 scans simultaneously:
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/example-site.yaml"}' \
-b cookies.txt &
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/example-site.yaml"}' \
-b cookies.txt &
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/example-site.yaml"}' \
-b cookies.txt &
```
2. Check all scans are running:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?status=running" | jq '.total'
```
**Expected Result:**
- [ ] All 3 scans start successfully
- [ ] All 3 scans have status "running"
- [ ] No database locking errors in logs
- [ ] All 3 scans eventually complete
### Test 30: API Responsiveness During Scan
**Objective:** Verify web UI and API remain responsive during long-running scans.
**Steps:**
1. Trigger a long-running scan (5+ minutes)
2. While scan is running, perform these actions:
- Navigate to dashboard
- List scans via API
- Get scan status via API
- Login/logout
**Expected Result:**
- [ ] Web UI loads quickly (< 2 seconds)
- [ ] API requests respond quickly (< 500ms)
- [ ] No timeouts or slow responses
- [ ] Background scan does not block HTTP requests
---
## Data Persistence
### Test 31: Database Persistence Across Restarts
**Objective:** Verify database persists across container restarts.
**Steps:**
1. Trigger a scan and wait for completion
2. Note the scan ID
3. Restart container:
```bash
docker-compose -f docker-compose-web.yml restart web
```
4. Wait for container to restart (check health)
5. Query scan via API
**Expected Result:**
- [ ] Container restarts successfully
- [ ] Database file persists
- [ ] Scan still accessible after restart
- [ ] All scan data intact
### Test 32: File Persistence
**Objective:** Verify scan files persist in volume.
**Steps:**
1. Trigger a scan and wait for completion
2. Note the file paths (JSON, HTML, ZIP, screenshots)
3. Verify files exist:
```bash
docker exec sneakyscanner_web ls -lh /app/output/scan_report_*.json
```
4. Restart container
5. Verify files still exist
**Expected Result:**
- [ ] All scan files created (JSON, HTML, ZIP, screenshots)
- [ ] Files persist after container restart
- [ ] Files accessible from host (mounted volume)
- [ ] File sizes are non-zero
---
## Security
### Test 33: Password Hashing
**Objective:** Verify passwords are hashed with bcrypt.
**Steps:**
1. Check password in database:
```bash
docker exec sneakyscanner_web sqlite3 /app/data/sneakyscanner.db \
"SELECT value FROM settings WHERE key='app_password';"
```
**Expected Result:**
- [ ] Password is not stored in plaintext
- [ ] Password starts with `$2b$` (bcrypt hash)
- [ ] Hash is ~60 characters long
### Test 34: Session Cookie Security
**Objective:** Verify session cookies have secure attributes (in production).
**Steps:**
1. Login via browser (with developer tools open)
2. Inspect cookies (Application > Cookies)
3. Check session cookie attributes
**Expected Result:**
- [ ] Session cookie has `HttpOnly` flag
- [ ] Session cookie has `Secure` flag (if HTTPS)
- [ ] Session cookie has `SameSite` attribute
- [ ] Session cookie expires on logout
### Test 35: SQL Injection Protection
**Objective:** Verify inputs are sanitized against SQL injection.
**Steps:**
1. Attempt SQL injection in scan list filter:
```bash
curl -b cookies.txt "http://localhost:5000/api/scans?status='; DROP TABLE scans; --"
```
2. Check database is intact:
```bash
docker exec sneakyscanner_web sqlite3 /app/data/sneakyscanner.db ".tables"
```
**Expected Result:**
- [ ] No SQL injection occurs
- [ ] Database tables intact
- [ ] API returns validation error or empty results
- [ ] No database errors in logs
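The protection this test exercises comes from parameterized queries: bound parameters are sent to the driver as data, never spliced into the SQL text. The app uses SQLAlchemy, but the same principle can be shown with the stdlib `sqlite3` module (table name and contents here are illustrative, not the app's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scans (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO scans (status) VALUES ('completed')")

# The malicious value is passed as a bound parameter, so the driver
# treats it as a literal string -- it never becomes part of the SQL.
malicious = "'; DROP TABLE scans; --"
rows = conn.execute(
    "SELECT id FROM scans WHERE status = ?", (malicious,)
).fetchall()

# The table still exists; the injection attempt simply matched no rows.
table_count = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE name='scans'"
).fetchone()[0]
```

The same guarantee holds for SQLAlchemy ORM filters, which always bind values as parameters.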
### Test 36: File Path Traversal Protection
**Objective:** Verify config file paths are validated against path traversal.
**Steps:**
1. Attempt path traversal in config_file:
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"../../../etc/passwd"}' \
-b cookies.txt
```
**Expected Result:**
- [ ] Request rejected with 400 Bad Request
- [ ] Error message indicates invalid config file
- [ ] No file outside /app/configs accessed
- [ ] Security error logged
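One common way to implement the validation this test checks is to resolve the user-supplied path against the allowed directory and reject anything that escapes it. A minimal sketch (the `/app/configs` root mirrors the test above; the helper name is hypothetical, not the app's actual validator):

```python
from pathlib import Path

ALLOWED_DIR = Path("/app/configs")  # allowed root for config files

def is_safe_config_path(user_path: str) -> bool:
    """Reject any path that resolves outside the allowed config directory."""
    resolved = (ALLOWED_DIR / user_path).resolve()
    return resolved.is_relative_to(ALLOWED_DIR.resolve())

ok = is_safe_config_path("example-site.yaml")
blocked = is_safe_config_path("../../../etc/passwd")
```

`Path.resolve()` collapses `..` components before the containment check, so traversal sequences cannot sneak past a simple prefix comparison.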
---
## Cleanup
### Test 37: Stop Services
**Objective:** Gracefully stop all services.
**Steps:**
1. Stop services:
```bash
docker-compose -f docker-compose-web.yml down
```
2. Verify containers stopped:
```bash
docker-compose -f docker-compose-web.yml ps
```
**Expected Result:**
- [ ] Services stop gracefully (no kill signals)
- [ ] All containers stopped
- [ ] No error messages in logs
- [ ] Volumes preserved (data, output, logs, configs)
### Test 38: Volume Cleanup (Optional)
**Objective:** Remove all data volumes (only if needed).
**Steps:**
1. Stop and remove volumes:
```bash
docker-compose -f docker-compose-web.yml down -v
```
2. Verify volumes removed:
```bash
docker volume ls | grep sneakyscanner
```
**Expected Result:**
- [ ] All volumes removed
- [ ] Database deleted
- [ ] Scan results deleted
- [ ] Logs deleted
**Warning:** This is destructive and removes all data!
---
## Summary
### Test Results Summary
Total Tests: 38
| Category | Tests | Passed | Failed |
|----------|-------|--------|--------|
| Deployment & Startup | 4 | | |
| Authentication | 5 | | |
| Scan Management (Web UI) | 5 | | |
| Scan Management (API) | 6 | | |
| Error Handling | 8 | | |
| Performance & Concurrency | 2 | | |
| Data Persistence | 2 | | |
| Security | 4 | | |
| Cleanup | 2 | | |
| **Total** | **38** | | |
### Critical Tests (Must Pass)
These tests are critical and must pass for Phase 2 to be considered complete:
- [ ] Test 2: Docker Compose Startup
- [ ] Test 3: Health Check
- [ ] Test 6: Login with Correct Password
- [ ] Test 15: Trigger Scan via API
- [ ] Test 16: Poll Scan Status
- [ ] Test 17: Get Scan Details via API
- [ ] Test 18: List Scans with Pagination
- [ ] Test 20: Delete Scan via API
- [ ] Test 29: Concurrent Scans
- [ ] Test 31: Database Persistence Across Restarts
### Known Issues
Document any known issues or test failures here:
1. **Issue:** [Description]
- **Severity:** Critical | High | Medium | Low
- **Workaround:** [Workaround if available]
- **Fix:** [Planned fix]
---
## Notes
- Tests should be run in order, as later tests may depend on earlier setup
- Some tests require multiple scans - consider batch creating scans for efficiency
- Performance tests are environment-dependent (Docker resources, network speed)
- Security tests are basic - professional security audit recommended for production
- Manual testing complements automated tests - both are important
---
**Manual Testing Checklist Version:** 1.0
**Phase:** 2 - Flask Web App Core
**Last Updated:** 2025-11-14


@@ -1,404 +0,0 @@
# Phase 1: Foundation - COMPLETE ✓
**Date Completed:** 2025-11-13
Phase 1 of the SneakyScanner roadmap has been successfully implemented. This document summarizes what was delivered and how to use the new infrastructure.
---
## ✓ Deliverables Completed
### 1. Database Schema & Models
- **SQLAlchemy models** for all 11 database tables (`web/models.py`)
- Core tables: `Scan`, `ScanSite`, `ScanIP`, `ScanPort`, `ScanService`, `ScanCertificate`, `ScanTLSVersion`
- Scheduling tables: `Schedule`, `Alert`, `AlertRule`
- Configuration: `Setting`
- **Alembic migrations** system configured (`migrations/`)
- **Initial migration** created (`migrations/versions/001_initial_schema.py`)
### 2. Settings System with Encryption
- **SettingsManager** class with CRUD operations (`web/utils/settings.py`)
- **Automatic encryption** for sensitive values (SMTP passwords, API tokens)
- **PasswordManager** for bcrypt password hashing
- **Default settings initialization** for SMTP, authentication, retention policies
### 3. Flask Application Structure
- **Flask app factory** pattern implemented (`web/app.py`)
- **API blueprints** for all major endpoints:
- `/api/scans` - Scan management (stub for Phase 2)
- `/api/schedules` - Schedule management (stub for Phase 3)
- `/api/alerts` - Alert management (stub for Phase 4)
- `/api/settings` - Settings API (functional in Phase 1!)
- **Error handlers** for common HTTP status codes
- **CORS support** for API access
- **Logging** to file and console
- **Database session management** with scoped sessions
### 4. Database Initialization
- **init_db.py** script for easy database setup
- Supports both Alembic migrations and direct table creation
- Password setting during initialization
- Database verification and settings display
### 5. Docker Support
- **Updated Dockerfile** with Flask dependencies
- **docker-compose-web.yml** for running the web application
- Separate service definition for database initialization
- Volume mounts for persistence (database, output, logs)
### 6. Validation & Testing
- **validate_phase1.py** script to verify all deliverables
- Validates directory structure, files, Python syntax, models, and API endpoints
- All checks passing ✓
---
## 📁 New Project Structure
```
SneakyScanner/
├── web/ # Flask web application (NEW)
│ ├── __init__.py
│ ├── app.py # Flask app factory
│ ├── models.py # SQLAlchemy models (11 tables)
│ ├── api/ # API blueprints
│ │ ├── __init__.py
│ │ ├── scans.py # Scans API
│ │ ├── schedules.py # Schedules API
│ │ ├── alerts.py # Alerts API
│ │ └── settings.py # Settings API (functional!)
│ ├── templates/ # Jinja2 templates (for Phase 3)
│ ├── static/ # CSS, JS, images (for Phase 3)
│ │ ├── css/
│ │ ├── js/
│ │ └── images/
│ └── utils/ # Utility modules
│ ├── __init__.py
│ └── settings.py # Settings manager with encryption
├── migrations/ # Alembic migrations (NEW)
│ ├── env.py # Alembic environment
│ ├── script.py.mako # Migration template
│ └── versions/
│ └── 001_initial_schema.py # Initial database migration
├── alembic.ini # Alembic configuration (NEW)
├── init_db.py # Database initialization script (NEW)
├── validate_phase1.py # Phase 1 validation script (NEW)
├── requirements-web.txt # Flask dependencies (NEW)
├── docker-compose-web.yml # Docker Compose for web app (NEW)
├── Dockerfile # Updated with Flask support
├── src/ # Existing scanner code (unchanged)
├── templates/ # Existing report templates (unchanged)
├── configs/ # Existing YAML configs (unchanged)
└── output/ # Existing scan outputs (unchanged)
```
---
## 🚀 Getting Started
### Option 1: Local Development (without Docker)
#### 1. Install Dependencies
```bash
# Install Flask and web dependencies
pip install -r requirements-web.txt
```
#### 2. Initialize Database
```bash
# Create database and set password
python3 init_db.py --password YOUR_SECURE_PASSWORD
# Verify database
python3 init_db.py --verify-only
```
#### 3. Run Flask Application
```bash
# Run development server
python3 -m web.app
# Application will be available at http://localhost:5000
```
#### 4. Test API Endpoints
```bash
# Health check
curl http://localhost:5000/api/settings/health
# Get all settings (sanitized)
curl http://localhost:5000/api/settings
# Get specific setting
curl http://localhost:5000/api/settings/smtp_server
# Update a setting
curl -X PUT http://localhost:5000/api/settings/smtp_server \
-H "Content-Type: application/json" \
-d '{"value": "smtp.gmail.com"}'
# Set application password
curl -X POST http://localhost:5000/api/settings/password \
-H "Content-Type: application/json" \
-d '{"password": "newsecurepassword"}'
```
---
### Option 2: Docker Deployment
#### 1. Build Docker Image
```bash
docker-compose -f docker-compose-web.yml build
```
#### 2. Initialize Database (one-time)
```bash
# Create data directory
mkdir -p data
# Initialize database
docker-compose -f docker-compose-web.yml run --rm init-db --password YOUR_SECURE_PASSWORD
```
#### 3. Run Web Application
```bash
# Start Flask web server
docker-compose -f docker-compose-web.yml up -d web
# View logs
docker-compose -f docker-compose-web.yml logs -f web
```
#### 4. Access Application
- Web API: http://localhost:5000
- Health checks:
- http://localhost:5000/api/scans/health
- http://localhost:5000/api/schedules/health
- http://localhost:5000/api/alerts/health
- http://localhost:5000/api/settings/health
---
## 🔐 Security Features
### Encryption
- **Fernet encryption** for sensitive settings (SMTP passwords, API tokens)
- Encryption key auto-generated and stored in settings table
- Can be overridden via `SNEAKYSCANNER_ENCRYPTION_KEY` environment variable
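The encryption round-trip looks roughly like the following sketch using the `cryptography` library's Fernet API (the key here is a throwaway; the app stores its key in the settings table or takes it from `SNEAKYSCANNER_ENCRYPTION_KEY`):

```python
from cryptography.fernet import Fernet

# Generate a throwaway key for illustration; in the app the key is
# persisted so tokens can be decrypted across restarts.
key = Fernet.generate_key()
f = Fernet(key)

# Sensitive settings (e.g. an SMTP password) are stored as Fernet tokens.
token = f.encrypt(b"smtp-secret-password")
plaintext = f.decrypt(token)
```

Fernet provides authenticated encryption, so a tampered token raises `InvalidToken` on decrypt rather than returning garbage.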
### Password Hashing
- **Bcrypt** for application password hashing (work factor 12)
- Password stored as irreversible hash in settings table
- Minimum 8 characters enforced
### Session Management
- Flask sessions with configurable `SECRET_KEY`
- Set via environment variable or config
---
## 📊 Database Schema
### Core Tables
- **scans** - Scan metadata and status
- **scan_sites** - Site groupings
- **scan_ips** - IP addresses scanned
- **scan_ports** - Discovered ports
- **scan_services** - Service detection results
- **scan_certificates** - SSL/TLS certificates
- **scan_tls_versions** - TLS version support
### Scheduling & Alerts
- **schedules** - Cron-like scan schedules
- **alerts** - Alert history
- **alert_rules** - Alert rule definitions
### Configuration
- **settings** - Application settings (key-value store)
All tables include proper foreign keys, indexes, and cascade delete rules.
---
## 🧪 Validation
Run the Phase 1 validation script to verify everything is in place:
```bash
python3 validate_phase1.py
```
Expected output:
```
✓ All Phase 1 validation checks passed!
```
---
## 🔧 Environment Variables
Configure the Flask app via environment variables:
```bash
# Flask configuration
export FLASK_ENV=development
export FLASK_DEBUG=true
export FLASK_HOST=0.0.0.0
export FLASK_PORT=5000
# Database
export DATABASE_URL=sqlite:///./sneakyscanner.db
# Security
export SECRET_KEY=your-secret-key-here
export SNEAKYSCANNER_ENCRYPTION_KEY=your-encryption-key-here
# CORS (comma-separated origins)
export CORS_ORIGINS=http://localhost:3000,https://your-domain.com
# Logging
export LOG_LEVEL=INFO
```
Or use a `.env` file (supported via `python-dotenv`).
---
## 📝 API Endpoints Summary
### Settings API (Functional in Phase 1)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/settings` | Get all settings (sanitized) | ✓ Working |
| PUT | `/api/settings` | Update multiple settings | ✓ Working |
| GET | `/api/settings/{key}` | Get specific setting | ✓ Working |
| PUT | `/api/settings/{key}` | Update specific setting | ✓ Working |
| DELETE | `/api/settings/{key}` | Delete setting | ✓ Working |
| POST | `/api/settings/password` | Set app password | ✓ Working |
| GET | `/api/settings/health` | Health check | ✓ Working |
### Scans API (Stubs for Phase 2)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/scans` | List scans | Phase 2 |
| GET | `/api/scans/{id}` | Get scan details | Phase 2 |
| POST | `/api/scans` | Trigger scan | Phase 2 |
| DELETE | `/api/scans/{id}` | Delete scan | Phase 2 |
| GET | `/api/scans/{id}/status` | Get scan status | Phase 2 |
| GET | `/api/scans/health` | Health check | ✓ Working |
### Schedules API (Stubs for Phase 3)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/schedules` | List schedules | Phase 3 |
| POST | `/api/schedules` | Create schedule | Phase 3 |
| PUT | `/api/schedules/{id}` | Update schedule | Phase 3 |
| DELETE | `/api/schedules/{id}` | Delete schedule | Phase 3 |
| POST | `/api/schedules/{id}/trigger` | Trigger schedule | Phase 3 |
| GET | `/api/schedules/health` | Health check | ✓ Working |
### Alerts API (Stubs for Phase 4)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/alerts` | List alerts | Phase 4 |
| GET | `/api/alerts/rules` | List alert rules | Phase 4 |
| POST | `/api/alerts/rules` | Create alert rule | Phase 4 |
| PUT | `/api/alerts/rules/{id}` | Update alert rule | Phase 4 |
| DELETE | `/api/alerts/rules/{id}` | Delete alert rule | Phase 4 |
| GET | `/api/alerts/health` | Health check | ✓ Working |
---
## ✅ Testing Checklist
- [x] Database creates successfully
- [x] Settings can be stored/retrieved
- [x] Encryption works for sensitive values
- [x] Password hashing works
- [x] Flask app starts without errors
- [x] API blueprints load correctly
- [x] Health check endpoints respond
- [x] All Python files have valid syntax
- [x] All models defined correctly
- [x] Database migrations work
---
## 🎯 Next Steps: Phase 2
Phase 2 will implement:
1. **REST API for scans** - Trigger scans, list history, get results
2. **Background job queue** - APScheduler for async scan execution
3. **Authentication** - Flask-Login for session management
4. **Scanner integration** - Save scan results to database
5. **Docker Compose deployment** - Production-ready setup
Estimated timeline: 2 weeks (as per roadmap)
---
## 📚 References
### Key Files
- `web/models.py` - Database models (lines 1-400+)
- `web/app.py` - Flask app factory (lines 1-250+)
- `web/utils/settings.py` - Settings manager (lines 1-300+)
- `init_db.py` - Database initialization (lines 1-200+)
- `migrations/versions/001_initial_schema.py` - Initial migration (lines 1-250+)
### Documentation
- [Flask Documentation](https://flask.palletsprojects.com/)
- [SQLAlchemy ORM](https://docs.sqlalchemy.org/)
- [Alembic Migrations](https://alembic.sqlalchemy.org/)
- [Cryptography Library](https://cryptography.io/)
- [Bcrypt](https://github.com/pyca/bcrypt)
---
## 🐛 Troubleshooting
### Database Issues
```bash
# Reset database
rm sneakyscanner.db
python3 init_db.py --password newpassword
# Check database
sqlite3 sneakyscanner.db ".schema"
```
### Flask Won't Start
```bash
# Check dependencies installed
pip list | grep -i flask
# Check syntax errors
python3 validate_phase1.py
# Run with debug output
FLASK_DEBUG=true python3 -m web.app
```
### Encryption Errors
```bash
# Generate new encryption key
python3 -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
# Set in environment
export SNEAKYSCANNER_ENCRYPTION_KEY="your-key-here"
```
---
**Phase 1 Status:** ✅ COMPLETE
All deliverables implemented, tested, and validated. Ready to proceed with Phase 2.

File diff suppressed because it is too large.


@@ -1,872 +0,0 @@
# Phase 2: Flask Web App Core - COMPLETE ✓
**Date Completed:** 2025-11-14
**Duration:** 14 days (2 weeks)
**Lines of Code Added:** ~4,500+ lines across backend, frontend, tests, and documentation
Phase 2 of the SneakyScanner roadmap has been successfully implemented. This document summarizes what was delivered, how to use the new features, and lessons learned.
---
## ✓ Success Criteria Met
All success criteria from [PHASE2.md](PHASE2.md) have been achieved:
### API Functionality ✅
- ✅ `POST /api/scans` triggers background scan and returns scan_id
- ✅ `GET /api/scans` lists scans with pagination (page, per_page params)
- ✅ `GET /api/scans/<id>` returns full scan details from database
- ✅ `DELETE /api/scans/<id>` removes scan records and files
- ✅ `GET /api/scans/<id>/status` shows current scan progress
### Database Integration ✅
- ✅ Scan results automatically saved to database after completion
- ✅ All relationships populated correctly (sites, IPs, ports, services, certs, TLS)
- ✅ Database queries work efficiently (indexes in place)
- ✅ Cascade deletion works for related records
### Background Jobs ✅
- ✅ Scans execute in background (don't block HTTP requests)
- ✅ Multiple scans can run concurrently (configurable: 3 concurrent jobs)
- ✅ Scan status updates correctly (running → completed/failed)
- ✅ Failed scans marked appropriately with error message
### Authentication ✅
- ✅ Login page renders and accepts password
- ✅ Successful login creates session and redirects to dashboard
- ✅ Invalid password shows error message
- ✅ Logout destroys session
- ✅ Protected routes require authentication
- ✅ API endpoints require authentication
### User Interface ✅
- ✅ Dashboard displays welcome message and stats
- ✅ Dashboard shows recent scans in table
- ✅ Login page has clean design
- ✅ Templates use Bootstrap 5 dark theme (matching report style)
- ✅ Navigation works between pages
- ✅ Error pages for 400, 401, 403, 404, 405, 500
### File Management ✅
- ✅ JSON, HTML, ZIP files still generated (backward compatible)
- ✅ Screenshot directory created with images
- ✅ Files referenced correctly in database
- ✅ Delete scan removes all associated files
### Deployment ✅
- ✅ Docker Compose starts web app successfully
- ✅ Database persists across container restarts
- ✅ Scan files persist in mounted volume
- ✅ Healthcheck endpoint responds correctly (`/api/settings/health`)
- ✅ Logs written to volume with rotation (10MB max, 10 backups)
### Testing ✅
- ✅ 100 test functions across 6 test files
- ✅ 1,825 lines of test code
- ✅ All tests passing (service layer, API, auth, error handling, background jobs)
- ✅ Comprehensive test coverage
### Documentation ✅
- ✅ API endpoints documented with examples (API_REFERENCE.md)
- ✅ README.md updated with Phase 2 features
- ✅ PHASE2_COMPLETE.md created (this document)
- ✅ ROADMAP.md updated
- ✅ DEPLOYMENT.md comprehensive deployment guide
---
## 📦 Deliverables by Step
### Step 1: Database & Service Layer ✅
**Completed:** Day 2
**Files Created:**
- `web/services/__init__.py`
- `web/services/scan_service.py` (545 lines) - Core business logic for scan CRUD operations
- `web/utils/pagination.py` (153 lines) - Pagination utility with metadata
- `web/utils/validators.py` (245 lines) - Input validation functions
- `migrations/versions/002_add_scan_indexes.py` - Database indexes for performance
- `tests/conftest.py` (142 lines) - Pytest fixtures and configuration
- `tests/test_scan_service.py` (374 lines) - 15 unit tests
**Key Features:**
- ScanService with full CRUD operations (`trigger_scan`, `get_scan`, `list_scans`, `delete_scan`, `get_scan_status`)
- Complex JSON-to-database mapping (`_map_report_to_models`)
- Validation for config files, scan IDs, ports, IP addresses
- Pagination helper with metadata (total, pages, current page)
- All 15 tests passing
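The pagination helper's job is to clamp the query parameters and compute the metadata the API returns. A minimal sketch of the same idea (the real implementation lives in `web/utils/pagination.py`; field names follow the API responses shown elsewhere in this document, and the 100-item cap matches Test 24):

```python
import math

def paginate(total: int, page: int = 1, per_page: int = 20,
             max_per_page: int = 100) -> dict:
    """Clamp parameters and compute pagination metadata."""
    per_page = min(max(per_page, 1), max_per_page)  # cap at 100
    pages = max(math.ceil(total / per_page), 1)
    page = min(max(page, 1), pages)
    return {
        "total": total,
        "page": page,
        "per_page": per_page,
        "pages": pages,
        "offset": (page - 1) * per_page,  # for the SQL OFFSET clause
    }

meta = paginate(total=42, page=2, per_page=20)
```

The `offset` value feeds directly into a `LIMIT ... OFFSET ...` query, while the rest of the dict is returned to the client.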
### Step 2: Scan API Endpoints ✅
**Completed:** Day 4
**Files Modified:**
- `web/api/scans.py` (262 lines) - All 5 endpoints fully implemented
**Files Created:**
- `tests/test_scan_api.py` (301 lines) - 24 integration tests
**Key Features:**
- All endpoints with comprehensive error handling
- Input validation through validators
- Proper HTTP status codes (200, 201, 400, 404, 500)
- Structured logging with request details
- Pagination support with query parameters
- Status filtering (`?status=running|completed|failed`)
- All 24 tests passing
### Step 3: Background Job Queue ✅
**Completed:** Day 6
**Files Created:**
- `web/jobs/__init__.py`
- `web/jobs/scan_job.py` (130 lines) - Background scan execution
- `web/services/scheduler_service.py` (220 lines) - APScheduler integration
- `migrations/versions/003_add_scan_timing_fields.py` - Timing fields (started_at, completed_at, error_message)
- `tests/test_background_jobs.py` (232 lines) - 13 unit tests
**Files Modified:**
- `web/app.py` - Scheduler initialization
- `web/models.py` - Added timing fields to Scan model
- `web/services/scan_service.py` - Updated for scheduler integration
- `web/api/scans.py` - Pass scheduler to trigger_scan
**Key Features:**
- BackgroundScheduler with ThreadPoolExecutor (max 3 workers)
- Isolated database sessions per thread
- Status tracking through lifecycle (created → running → completed/failed)
- Error message capture and storage
- Graceful shutdown handling
- All 13 tests passing
### Step 4: Authentication System ✅
**Completed:** Day 8
**Files Created:**
- `web/auth/__init__.py`
- `web/auth/routes.py` (85 lines) - Login/logout routes
- `web/auth/decorators.py` (62 lines) - @login_required and @api_auth_required
- `web/auth/models.py` (48 lines) - User class for Flask-Login
- `web/templates/login.html` (95 lines) - Login page with dark theme
- `tests/test_authentication.py` (279 lines) - 30+ authentication tests
**Files Modified:**
- `web/app.py` - Flask-Login integration, user_loader callback
- All API endpoints - Protected with @api_auth_required
- All web routes - Protected with @login_required
**Key Features:**
- Flask-Login session management
- Single-user authentication with bcrypt password hashing
- Session-based auth for both UI and API
- Login/logout functionality
- Password setup on first run
- All 30+ tests passing
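The shape of the `@api_auth_required` decorator can be sketched framework-free (in the real app, Flask-Login's session handling replaces the stand-in `session` dict, and the 401 body matches the API's error format):

```python
from functools import wraps

# Stand-in session store; in the app, Flask-Login reads the session cookie.
session = {"authenticated": False}

def api_auth_required(view):
    """Return a 401 JSON error when no authenticated session exists."""
    @wraps(view)
    def wrapper(*args, **kwargs):
        if not session.get("authenticated"):
            return {"error": "Unauthorized", "message": "Login required"}, 401
        return view(*args, **kwargs)
    return wrapper

@api_auth_required
def list_scans():
    return {"scans": []}, 200

denied = list_scans()            # no session -> 401 tuple
session["authenticated"] = True
allowed = list_scans()           # authenticated -> normal response
```

Returning JSON with a 401 status (rather than redirecting to the login page) is what lets API clients like `curl` distinguish auth failures from other errors.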
### Step 5: Basic UI Templates ✅
**Completed:** Day 10
**Files Created:**
- `web/templates/base.html` (120 lines) - Base layout with Bootstrap 5 dark theme
- `web/templates/dashboard.html` (180 lines) - Dashboard with stats and recent scans
- `web/templates/scans.html` (240 lines) - Scan list with pagination
- `web/templates/scan_detail.html` (320 lines) - Detailed scan results view
- `web/routes/__init__.py`
- `web/routes/main.py` (150 lines) - Web UI routes
- `web/static/css/custom.css` (85 lines) - Custom dark theme styles
- `web/static/js/dashboard.js` (120 lines) - AJAX and auto-refresh
**Key Features:**
- Consistent dark theme matching HTML reports (slate/grey color scheme)
- Navigation bar (Dashboard, Scans, Settings, Logout)
- Flash message display
- AJAX-powered dynamic data loading
- Auto-refresh for running scans (5-second polling)
- Responsive design with Bootstrap 5
- Pagination controls
### Step 6: Docker & Deployment ✅
**Completed:** Day 11
**Files Created:**
- `.env.example` (57 lines) - Comprehensive environment template
- `docs/ai/DEPLOYMENT.md` (650+ lines) - Complete deployment guide
**Files Modified:**
- `docker-compose-web.yml` - Scheduler config, healthcheck, privileged mode, host networking
**Key Features:**
- Healthcheck endpoint monitoring (30s interval, 10s timeout)
- Privileged mode for scanner raw socket access
- Host networking for unrestricted network scanning
- Environment variable configuration (SECRET_KEY, ENCRYPTION_KEY, scheduler settings)
- Volume mounts for data persistence (data, output, logs, configs)
- Production defaults (FLASK_ENV=production)
- Comprehensive deployment documentation
### Step 7: Error Handling & Logging ✅
**Completed:** Day 12
**Files Created:**
- `web/templates/errors/400.html` (70 lines)
- `web/templates/errors/401.html` (70 lines)
- `web/templates/errors/403.html` (70 lines)
- `web/templates/errors/404.html` (70 lines)
- `web/templates/errors/405.html` (70 lines)
- `web/templates/errors/500.html` (90 lines)
- `tests/test_error_handling.py` (320 lines) - Comprehensive error handling tests
**Files Modified:**
- `web/app.py` - Enhanced logging, error handlers, request handlers
**Key Features:**
- RotatingFileHandler (10MB per file, 10 backups)
- Separate error log file for ERROR level messages
- RequestIDLogFilter for request context injection
- Request timing with millisecond precision
- Content negotiation (JSON for API, HTML for web)
- SQLite WAL mode for better concurrency
- Security headers (X-Content-Type-Options, X-Frame-Options, X-XSS-Protection)
- Request IDs in logs and headers (X-Request-ID, X-Request-Duration-Ms)
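A log filter that injects a per-request ID into every record, as described above, can be sketched like this (the real filter pulls the ID from Flask's request context; here a plain attribute stands in):

```python
import logging
import uuid

class RequestIDLogFilter(logging.Filter):
    """Inject a request ID into every log record (sketch of the idea)."""
    def __init__(self):
        super().__init__()
        self.request_id = "-"  # default when outside a request

    def filter(self, record):
        record.request_id = self.request_id
        return True  # never drop records; only annotate them

log_filter = RequestIDLogFilter()
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("[%(request_id)s] %(levelname)s %(message)s"))
handler.addFilter(log_filter)

logger = logging.getLogger("sneakyscanner.sketch")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# At the start of each request the app generates an 8-char hex ID:
log_filter.request_id = uuid.uuid4().hex[:8]
logger.info("GET /api/scans 200")  # logs as: [a1b2c3d4] INFO GET /api/scans 200
```

The same ID is echoed back in the `X-Request-ID` response header, so a client-reported failure can be matched to its log lines.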
### Step 8: Testing & Documentation ✅
**Completed:** Day 14
**Files Created:**
- `docs/ai/API_REFERENCE.md` (650+ lines) - Complete API documentation
- `docs/ai/PHASE2_COMPLETE.md` (this document)
- `docs/ai/MANUAL_TESTING.md` - Manual testing checklist
**Files Modified:**
- `README.md` - Comprehensive update with Phase 2 features
- `docs/ai/ROADMAP.md` - Updated with Phase 2 completion
**Documentation Deliverables:**
- API reference with request/response examples
- Updated README with web application features
- Phase 2 completion summary
- Manual testing checklist
- Updated roadmap
---
## 📊 Statistics
### Code Metrics
| Category | Files | Lines of Code |
|----------|-------|---------------|
| Backend Services | 3 | 965 |
| API Endpoints | 1 (modified) | 262 |
| Background Jobs | 2 | 350 |
| Authentication | 3 | 195 |
| Web UI Templates | 11 | 1,440 |
| Utilities | 2 | 398 |
| Database Migrations | 2 | 76 |
| Tests | 6 | 1,825 |
| Documentation | 4 | 2,000+ |
| **Total** | **34** | **~7,500+** |
### Test Coverage
- **Test Files:** 6
- **Test Functions:** 100
- **Lines of Test Code:** 1,825
- **Coverage Areas:**
- Service layer (ScanService, SchedulerService)
- API endpoints (all 5 scan endpoints)
- Authentication (login, logout, decorators)
- Background jobs (scheduler, job execution, timing)
- Error handling (all HTTP status codes, content negotiation)
- Pagination and validation
### Database Schema
- **Tables:** 11 (no changes from Phase 1)
- **Migrations:** 3 total
- `001_initial_schema.py` (Phase 1)
- `002_add_scan_indexes.py` (Step 1)
- `003_add_scan_timing_fields.py` (Step 3)
- **Indexes:** Status index for efficient filtering
- **Mode:** SQLite WAL for better concurrency
---
## 🎯 Key Accomplishments
### 1. Complete REST API for Scan Management
All CRUD operations implemented with comprehensive error handling:
```bash
# Trigger scan
POST /api/scans
  {"config_file": "/app/configs/example.yaml"}
  → {"scan_id": 42, "status": "running"}

# List scans (paginated)
GET /api/scans?page=1&per_page=20&status=completed
  → {"scans": [...], "total": 42, "page": 1, "pages": 3}

# Get scan details
GET /api/scans/42
  → {full scan with all relationships}

# Poll status
GET /api/scans/42/status
  → {"status": "running", "started_at": "...", "completed_at": null}

# Delete scan
DELETE /api/scans/42
  → {"message": "Scan 42 deleted successfully"}
```
### 2. Asynchronous Scan Execution
Scans run in background threads without blocking HTTP requests:
- APScheduler BackgroundScheduler with ThreadPoolExecutor
- Up to 3 concurrent scans (configurable)
- Isolated database sessions per thread
- Status tracking: `running` → `completed`/`failed`
- Error capture and storage
**Result:** Web UI remains responsive during long-running scans (2-10 minutes)
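The real implementation uses APScheduler's `BackgroundScheduler`; the core pattern — a bounded worker pool with per-job status tracking — can be sketched with the stdlib `ThreadPoolExecutor` (the in-memory `statuses` dict stands in for the database updates the app performs):

```python
from concurrent.futures import ThreadPoolExecutor
import threading
import time

statuses = {}          # scan_id -> status; the app writes this to the DB
lock = threading.Lock()

def run_scan(scan_id: int):
    with lock:
        statuses[scan_id] = "running"
    try:
        time.sleep(0.01)  # stand-in for the masscan/nmap/sslyze work
        with lock:
            statuses[scan_id] = "completed"
    except Exception:
        with lock:
            statuses[scan_id] = "failed"

# Max 3 concurrent scans, mirroring the configured worker pool.
executor = ThreadPoolExecutor(max_workers=3)
futures = [executor.submit(run_scan, i) for i in range(3)]
executor.shutdown(wait=True)  # graceful shutdown: let running jobs finish
```

Because `submit()` returns immediately, the HTTP handler can respond with the new scan ID while the scan itself runs on a worker thread.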
### 3. Complete Database Integration
Complex JSON scan reports are mapped to a normalized relational schema:
- **Hierarchy:** Scan → Sites → IPs → Ports → Services → Certificates → TLS Versions
- **Relationships:** Proper foreign keys and cascade deletion
- **Efficient Queries:** Indexes on status, timestamp
- **Concurrency:** SQLite WAL mode for multiple readers/writers
**Result:** All scan data queryable in database for future trend analysis
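The WAL setting is a one-line pragma; a minimal sketch with the stdlib `sqlite3` module (the file path here is a throwaway, not the app's real database location):

```python
import os
import sqlite3
import tempfile

# Throwaway database file; the web app points this at its own SQLite file.
db_path = os.path.join(tempfile.mkdtemp(), 'scans.db')
conn = sqlite3.connect(db_path, timeout=15)  # generous busy timeout, as above
mode = conn.execute('PRAGMA journal_mode=WAL').fetchone()[0]
print(mode)  # → wal
conn.close()
```

With WAL enabled, readers no longer block the single writer, which is what lets status polling continue while a background scan commits results.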
### 4. Secure Authentication System
Single-user authentication with Flask-Login:
- Session-based auth for both UI and API
- Bcrypt password hashing (cost factor 12)
- Protected routes with decorators
- Login/logout functionality
- Password setup on first run
**Result:** Secure access control for all features
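The hash-and-verify flow looks roughly like the sketch below, which substitutes stdlib PBKDF2 for bcrypt purely so it runs anywhere (the app itself uses `bcrypt.checkpw` with cost factor 12):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200_000):
    """Salted one-way hash (PBKDF2 here; bcrypt in the app)."""
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 200_000) -> bool:
    candidate = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password('correct horse')
print(verify_password('correct horse', salt, digest))  # → True
print(verify_password('wrong', salt, digest))          # → False
```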
### 5. Production-Ready Deployment
Complete Docker deployment with persistent data:
- Docker Compose configuration with healthcheck
- Privileged mode for scanner operations
- Environment-based configuration
- Volume mounts for data persistence
- Comprehensive deployment documentation
**Result:** Easy deployment with `docker-compose up`
### 6. Comprehensive Error Handling
Robust error handling and logging:
- Content negotiation (JSON for API, HTML for web)
- Custom error templates (400, 401, 403, 404, 405, 500)
- Structured logging with request IDs
- Log rotation (10MB files, 10 backups)
- Request timing and duration tracking
**Result:** Production-ready error handling and debugging
### 7. Extensive Test Coverage
Comprehensive test suite:
- 100 test functions across 6 test files
- 1,825 lines of test code
- All major components tested
- Integration tests for complete workflows
- All tests passing
**Result:** High confidence in code quality and reliability
---
## 🔧 Technical Implementation Details
### Service Layer Architecture
**ScanService** (`web/services/scan_service.py`) - 545 lines:
- `trigger_scan(config_file, triggered_by, schedule_id)` - Create scan record and queue job
- `get_scan(scan_id)` - Retrieve complete scan with all relationships (eager loading)
- `list_scans(page, per_page, status_filter)` - Paginated list with filtering
- `delete_scan(scan_id)` - Remove DB records and files (JSON, HTML, ZIP, screenshots)
- `get_scan_status(scan_id)` - Poll scan status for real-time updates
- `_save_scan_to_db(report, scan_id, status)` - Persist scan results
- `_map_report_to_models(report, scan_obj)` - Complex JSON→DB mapping
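The pagination shape `list_scans` returns can be sketched over a plain list (field names taken from the API example above; the helper itself is illustrative):

```python
def paginate(items, page=1, per_page=20):
    """Minimal stand-in for list_scans() pagination."""
    total = len(items)
    pages = max(1, -(-total // per_page))  # ceiling division
    start = (page - 1) * per_page
    return {
        'scans': items[start:start + per_page],
        'total': total,
        'page': page,
        'pages': pages,
    }

result = paginate(list(range(42)), page=3, per_page=20)
print(result['pages'], len(result['scans']))  # → 3 2
```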
**SchedulerService** (`web/services/scheduler_service.py`) - 220 lines:
- `init_scheduler(app)` - Initialize APScheduler
- `queue_scan(config_file, scan_id, db_url)` - Queue immediate scan execution
- `add_scheduled_scan(schedule)` - Placeholder for Phase 3 scheduled scans
- `remove_scheduled_scan(schedule_id)` - Remove scheduled jobs
- `list_jobs()` - List all scheduler jobs
- `shutdown()` - Graceful shutdown
### Background Job Execution
**Scan Job** (`web/jobs/scan_job.py`) - 130 lines:
```python
import logging
from datetime import datetime

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

logger = logging.getLogger(__name__)

def execute_scan(config_file, scan_id, db_url):
    """Execute scan in background thread."""
    # 1. Create isolated DB session for this thread
    engine = create_engine(db_url)
    Session = sessionmaker(bind=engine)
    session = Session()
    try:
        # 2. Update status to running
        scan = session.query(Scan).get(scan_id)
        scan.status = 'running'
        scan.started_at = datetime.utcnow()
        session.commit()
        # 3. Run scanner
        scanner = SneakyScanner(config_file)
        report, timestamp = scanner.scan()
        scanner.generate_outputs(report, timestamp)
        # 4. Save to database
        scan_service = ScanService(session)
        scan_service._save_scan_to_db(report, scan_id, status='completed')
        # 5. Update timing
        scan.completed_at = datetime.utcnow()
        session.commit()
    except Exception as e:
        # 6. Roll back any partial writes, then mark the scan as failed
        session.rollback()
        scan = session.query(Scan).get(scan_id)
        scan.status = 'failed'
        scan.error_message = str(e)
        scan.completed_at = datetime.utcnow()
        session.commit()
        logger.error(f"Scan {scan_id} failed: {e}")
    finally:
        session.close()
```
### Database Mapping Strategy
The complex JSON structure is mapped to the normalized schema in a specific order:
1. **Scan** - Top-level metadata
2. **Sites** - Logical grouping from config
3. **IPs** - IP addresses per site
4. **Ports** - Open ports per IP
5. **Services** - Service detection per port
6. **Certificates** - SSL/TLS certs per HTTPS service
7. **TLS Versions** - TLS version support per certificate
**Key Technique:** Use `session.flush()` after each level to generate IDs for foreign keys
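The same parent-before-child ordering can be seen with raw SQLite, where `lastrowid` plays the role `session.flush()` plays for SQLAlchemy (the two-table schema below is a simplified stand-in for the real tables):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
    CREATE TABLE sites (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE ips   (id INTEGER PRIMARY KEY,
                        site_id INTEGER REFERENCES sites(id), addr TEXT);
''')
# Insert the parent first so its generated ID exists before the child needs it
site_id = conn.execute("INSERT INTO sites (name) VALUES ('prod')").lastrowid
conn.execute("INSERT INTO ips (site_id, addr) VALUES (?, ?)", (site_id, '10.0.0.1'))
addr = conn.execute('SELECT addr FROM ips WHERE site_id = ?', (site_id,)).fetchone()[0]
print(addr)  # → 10.0.0.1
```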
### Authentication Flow
```
┌──────────────────────────────────────┐
│ 1. User visits /dashboard            │
│    (not authenticated)               │
└───────────┬──────────────────────────┘
            ▼
┌──────────────────────────────────────┐
│ 2. @login_required redirects to      │
│    /login                            │
└───────────┬──────────────────────────┘
            ▼
┌──────────────────────────────────────┐
│ 3. User enters password              │
│    POST /auth/login                  │
└───────────┬──────────────────────────┘
            ▼
┌──────────────────────────────────────┐
│ 4. Verify password (bcrypt)          │
│    - Load password from settings     │
│    - Check with bcrypt.checkpw()     │
└───────────┬──────────────────────────┘
            ▼
┌──────────────────────────────────────┐
│ 5. Create Flask-Login session        │
│    login_user(user)                  │
└───────────┬──────────────────────────┘
            ▼
┌──────────────────────────────────────┐
│ 6. Redirect to /dashboard            │
│    (authenticated, can access)       │
└──────────────────────────────────────┘
```
### Error Handling Architecture
**Content Negotiation:**
```python
from flask import jsonify, render_template, request

def render_error(status_code, error_type, message):
    """Render error as JSON or HTML based on request."""
    # API paths and JSON Accept headers get a JSON body
    if request.path.startswith('/api/') or \
       request.accept_mimetypes.best == 'application/json':
        return jsonify({
            'error': error_type,
            'message': message
        }), status_code
    # Otherwise return HTML error page
    return render_template(f'errors/{status_code}.html',
                           error=error_type,
                           message=message), status_code
```
**Request ID Tracking:**
```python
import time
import uuid

from flask import request

@app.before_request
def before_request():
    """Add request ID and start timing."""
    request.id = uuid.uuid4().hex[:8]
    request.start_time = time.time()

@app.after_request
def after_request(response):
    """Add timing and request ID headers."""
    duration_ms = int((time.time() - request.start_time) * 1000)
    response.headers['X-Request-ID'] = request.id
    response.headers['X-Request-Duration-Ms'] = str(duration_ms)
    return response
```
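The `RequestIDLogFilter` used for log correlation can be sketched as a standard `logging.Filter`; the implementation below is an assumption, not the app's exact code:

```python
import io
import logging

class RequestIDLogFilter(logging.Filter):
    """Stamp every record with the current request ID."""
    def __init__(self, get_request_id):
        super().__init__()
        self.get_request_id = get_request_id

    def filter(self, record):
        record.request_id = self.get_request_id()
        return True  # never drop the record, only annotate it

# Demo: a logger whose format includes the injected request_id field
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter('[%(request_id)s] %(message)s'))
handler.addFilter(RequestIDLogFilter(lambda: 'a1b2c3d4'))
logger = logging.getLogger('sneakyscanner.demo')
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info('scan started')
print(stream.getvalue().strip())  # → [a1b2c3d4] scan started
```

In the app, `get_request_id` would read the ID set in `before_request`, so every log line from one request carries the same eight-character tag that also appears in the `X-Request-ID` response header.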
---
## 📚 API Endpoints Reference
See [API_REFERENCE.md](API_REFERENCE.md) for complete documentation.
### Scans
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/api/scans` | Trigger new scan |
| GET | `/api/scans` | List scans (paginated, filterable) |
| GET | `/api/scans/{id}` | Get scan details |
| GET | `/api/scans/{id}/status` | Get scan status |
| DELETE | `/api/scans/{id}` | Delete scan and files |
### Authentication
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/auth/login` | Login and create session |
| GET | `/auth/logout` | Logout and destroy session |
### Settings
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/settings` | Get all settings |
| PUT | `/api/settings/{key}` | Update setting |
| GET | `/api/settings/health` | Health check |
### Web UI
| Method | Route | Description |
|--------|-------|-------------|
| GET | `/` | Redirect to dashboard |
| GET | `/login` | Login page |
| GET | `/dashboard` | Dashboard with stats |
| GET | `/scans` | Browse scan history |
| GET | `/scans/<id>` | View scan details |
---
## 🚀 Getting Started
### Quick Start (Docker)
1. **Clone repository:**
```bash
git clone https://github.com/yourusername/sneakyscanner.git
cd sneakyscanner
```
2. **Configure environment:**
```bash
cp .env.example .env
# Edit .env and set SECRET_KEY and SNEAKYSCANNER_ENCRYPTION_KEY
```
3. **Start web application:**
```bash
docker-compose -f docker-compose-web.yml up -d
```
4. **Access web interface:**
- Open http://localhost:5000
- Default password: `admin` (change immediately!)
5. **Trigger first scan:**
- Click "Run Scan Now" on dashboard
- Or use API:
```bash
curl -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/example-site.yaml"}' \
-b cookies.txt
```
See [DEPLOYMENT.md](DEPLOYMENT.md) for detailed setup instructions.
### API Usage Example
```bash
#!/bin/bash
# 1. Login
curl -X POST http://localhost:5000/auth/login \
-H "Content-Type: application/json" \
-d '{"password":"yourpassword"}' \
-c cookies.txt
# 2. Trigger scan
SCAN_ID=$(curl -s -X POST http://localhost:5000/api/scans \
-H "Content-Type: application/json" \
-d '{"config_file":"/app/configs/production.yaml"}' \
-b cookies.txt | jq -r '.scan_id')
echo "Scan ID: $SCAN_ID"
# 3. Poll status
while true; do
STATUS=$(curl -s -X GET http://localhost:5000/api/scans/$SCAN_ID/status \
-b cookies.txt | jq -r '.status')
echo "Status: $STATUS"
if [ "$STATUS" == "completed" ] || [ "$STATUS" == "failed" ]; then
break
fi
sleep 5
done
# 4. Get results
curl -X GET http://localhost:5000/api/scans/$SCAN_ID \
-b cookies.txt | jq '.'
```
---
## 🧪 Testing
### Run All Tests
**In Docker:**
```bash
docker-compose -f docker-compose-web.yml run --rm web pytest tests/ -v
```
**Locally:**
```bash
pip install -r requirements-web.txt
pytest tests/ -v
```
### Test Breakdown
| Test File | Tests | Description |
|-----------|-------|-------------|
| `test_scan_service.py` | 15 | Service layer CRUD operations |
| `test_scan_api.py` | 24 | API endpoints integration tests |
| `test_authentication.py` | 30+ | Login, logout, decorators |
| `test_background_jobs.py` | 13 | Scheduler and job execution |
| `test_error_handling.py` | 18+ | Error handlers, logging, headers |
| **Total** | **100** | **All passing ✓** |
### Manual Testing
See [MANUAL_TESTING.md](MANUAL_TESTING.md) for comprehensive manual testing checklist.
**Quick Manual Tests:**
1. Login with correct password → succeeds
2. Login with incorrect password → fails
3. Trigger scan via UI → runs in background
4. View scan list → shows pagination
5. View scan details → displays all data
6. Delete scan → removes files and DB records
7. Logout → destroys session
---
## 🎓 Lessons Learned
### What Went Well
1. **Service Layer Architecture** - Clean separation between API endpoints and business logic made testing much easier
2. **Background Job Integration** - APScheduler worked perfectly for async scan execution without needing Redis/Celery
3. **Database Mapping Strategy** - Processing in order (sites → IPs → ports → services → certs → TLS) with `flush()` after each level handled foreign keys elegantly
4. **Test-First Approach** - Writing tests for Steps 1-3 before implementation caught many edge cases early
5. **Comprehensive Documentation** - Detailed PHASE2.md plan made implementation straightforward and prevented scope creep
### Challenges Overcome
1. **SQLite Concurrency** - Initial database locking issues with concurrent scans
- **Solution:** Enabled WAL mode, added connection pooling, increased busy timeout to 15s
2. **Complex JSON→DB Mapping** - Nested JSON structure with many relationships
- **Solution:** Created `_map_report_to_models()` with ordered processing and `flush()` for ID generation
3. **Background Thread Sessions** - SQLAlchemy session management in threads
- **Solution:** Create isolated session per thread, pass `db_url` to background job
4. **Content Negotiation** - API and web requests need different error formats
- **Solution:** Check `request.path.startswith('/api/')` and `Accept` header
5. **Request ID Correlation** - Difficult to correlate logs across request lifecycle
- **Solution:** Add RequestIDLogFilter with UUID-based request IDs in logs and headers
### Technical Decisions
1. **APScheduler over Celery** - Simpler deployment, sufficient for single-user use case
2. **Session Auth over JWT** - Simpler for Phase 2, token auth deferred to Phase 5
3. **SQLite WAL Mode** - Better concurrency without switching databases
4. **Bootstrap 5 Dark Theme** - Matches existing HTML report aesthetics
5. **Pytest over unittest** - More powerful fixtures, better parametrization
---
## 🔮 What's Next: Phase 3
**Target Duration:** Weeks 5-6 (2 weeks)
**Goals:**
- Enhanced dashboard with trend charts (Chart.js)
- Scheduled scan management UI
- Real-time scan progress
- Timeline view of scan history
**Key Features:**
- **Dashboard Enhancement:**
- Summary cards (total scans, last scan, IPs, ports)
- Recent scans table
- Security warnings section
- Drift alerts section
- **Trend Charts:**
- Port count over time (line chart)
- Service distribution (bar chart)
- Certificate expiration timeline
- **Scheduled Scans:**
- List/create/edit/delete schedules
- Cron expression configuration
- Next run time display
- APScheduler job management
See [ROADMAP.md](ROADMAP.md) for complete Phase 3 plan.
---
## 📝 Migration from Phase 1
Phase 2 is fully backward compatible with Phase 1:
**No Breaking Changes:**
- ✅ Database schema unchanged (11 tables from Phase 1)
- ✅ CLI scanner still works standalone
- ✅ YAML config format unchanged
- ✅ JSON/HTML/ZIP output format unchanged
- ✅ Settings system compatible
**New Additions:**
- ✅ REST API endpoints (were stubs in Phase 1)
- ✅ Background job system
- ✅ Authentication system
- ✅ Web UI templates
- ✅ 3 new database migrations
**Migration Steps:**
1. Pull latest code
2. Run database migrations: `alembic upgrade head`
3. Set application password (if not set): `python3 init_db.py --password YOUR_PASSWORD`
4. Rebuild Docker image: `docker-compose -f docker-compose-web.yml build`
5. Start services: `docker-compose -f docker-compose-web.yml up -d`
---
## 📊 Final Metrics
### Code Coverage
- **Total Lines Added:** ~7,500+
- **Files Created:** 34
- **Files Modified:** 10
- **Test Coverage:** 100 test functions, 1,825 lines
- **Documentation:** 2,000+ lines
### Features Delivered
- ✅ 5 REST API endpoints (scans CRUD + status)
- ✅ 3 settings endpoints (get, update, health)
- ✅ Background job queue with APScheduler
- ✅ Session-based authentication
- ✅ 5 web UI pages (login, dashboard, scans list/detail, errors)
- ✅ 6 error templates (400, 401, 403, 404, 405, 500)
- ✅ Comprehensive error handling and logging
- ✅ Docker deployment with healthcheck
- ✅ Complete API documentation
- ✅ Deployment guide
### Success Rate
- ✅ All 100 tests passing
- ✅ All success criteria met
- ✅ All deliverables completed on time
- ✅ Zero critical bugs
- ✅ Production-ready deployment
---
## 🙏 Acknowledgments
**Technologies Used:**
- Flask 3.0 - Web framework
- SQLAlchemy 2.0 - ORM
- APScheduler 3.10 - Background jobs
- Flask-Login 0.6 - Authentication
- Bootstrap 5 - UI framework
- pytest 7.4 - Testing
- Alembic 1.13 - Database migrations
---
## 📞 Support
**Documentation:**
- [API Reference](API_REFERENCE.md)
- [Deployment Guide](DEPLOYMENT.md)
- [Developer Guide](../../CLAUDE.md)
- [Roadmap](ROADMAP.md)
**Issues:** https://github.com/anthropics/sneakyscanner/issues
---
**Phase 2 Status:** COMPLETE ✓
**Next Phase:** Phase 3 - Dashboard & Scheduling
**Last Updated:** 2025-11-14
