Compare commits
61 commits: b2a3fc7832 ... nightly
| SHA1 |
|---|
| 4b197e0b3d |
| 30f0987a99 |
| 9e2fc348b7 |
| 847e05abbe |
| 07c2bcfd11 |
| a560bae800 |
| 56828e4184 |
| 5e3a70f837 |
| 451c7e92ff |
| 8b89fd506d |
| f24bd11dfd |
| 9bd2f67150 |
| 3058c69c39 |
| 04dc238aea |
| c592000c96 |
| 4c6b4bf35d |
| 3adb51ece2 |
| c4cbbee280 |
| 889e1eaac3 |
| a682e5233c |
| 7a14f1602b |
| 949bccf644 |
| 801ddc8d81 |
| db5c828b5f |
| a044c19a46 |
| a5e2b43944 |
| 3219f8a861 |
| 480065ed14 |
| 73a3b95834 |
| 8d8e53c903 |
| 12d5aff7a5 |
| cc3758f92d |
| 9804f9c032 |
| e3b647521e |
| 7460c9e23e |
| 66b02edc84 |
| f8b89c46c2 |
| 6d5005403c |
| 05f846809e |
| 7c26824aa1 |
| 91507cc8f8 |
| 7437716613 |
| 657f4784bf |
| 73d04cae5e |
| b8c3e4e2d8 |
| aa7c32381c |
| 0fc51eb032 |
| fdf689316f |
| 41ba4c47b5 |
| b2e6efb4b3 |
| e7dd207a62 |
| 30a29142a0 |
| 0ec338e252 |
| 034f146fa1 |
| 4a4c33a10b |
| 21254c3522 |
| 230094d7b2 |
| 28b32a2049 |
| 1d076a467a |
| 3c740268c4 |
| 131e1f5a61 |
.env.example (30 changes)
File diff suppressed because one or more lines are too long
.gitignore (5 changes, vendored)
```diff
@@ -9,6 +9,11 @@ output/
 data/
 logs/
 
+# Environment and secrets
+.env
+admin_password.txt
+logs/admin_password.txt
+
 # Python
 __pycache__/
 *.py[cod]
```
Dockerfile

```diff
@@ -39,12 +39,13 @@ COPY app/web/ ./web/
 COPY app/migrations/ ./migrations/
 COPY app/alembic.ini .
 COPY app/init_db.py .
+COPY app/docker-entrypoint.sh /docker-entrypoint.sh
 
 # Create required directories
 RUN mkdir -p /app/output /app/logs
 
 # Make scripts executable
-RUN chmod +x /app/src/scanner.py /app/init_db.py
+RUN chmod +x /app/src/scanner.py /app/init_db.py /docker-entrypoint.sh
 
 # Force Python unbuffered output
 ENV PYTHONUNBUFFERED=1
```
README.md (88 changes)
````diff
@@ -3,7 +3,7 @@
 A comprehensive network scanning and infrastructure monitoring platform with web interface and CLI scanner. SneakyScanner uses masscan for fast port discovery, nmap for service detection, sslyze for SSL/TLS analysis, and Playwright for webpage screenshots to perform comprehensive infrastructure audits.
 
 **Primary Interface**: Web Application (Flask-based GUI)
 **Alternative**: Standalone CLI Scanner (for testing and CI/CD)
 **Scripting/Automation**: REST API (see [API Reference](docs/API_REFERENCE.md))
 
 ---
 
@@ -12,7 +12,7 @@ A comprehensive network scanning and infrastructure monitoring platform with web
 - 🌐 **Web Dashboard** - Modern web UI for scan management, scheduling, and historical analysis
 - 📊 **Database Storage** - SQLite-based scan history with trend analysis and comparison
 - ⏰ **Scheduled Scans** - Cron-based automated scanning with APScheduler
-- 🔧 **Config Creator** - CIDR-to-YAML configuration builder for quick setup
+- 🔧 **Config Creator** - Web-based target configuration builder for quick setup
 - 🔍 **Network Discovery** - Fast port scanning with masscan (all 65535 ports, TCP/UDP)
 - 🎯 **Service Detection** - Nmap-based service enumeration with version detection
 - 🔒 **SSL/TLS Analysis** - Certificate extraction, TLS version testing, cipher suite analysis
@@ -20,13 +20,36 @@ A comprehensive network scanning and infrastructure monitoring platform with web
 - 📈 **Drift Detection** - Expected vs. actual infrastructure comparison
 - 📋 **Multi-Format Reports** - JSON, HTML, and ZIP archives with visual reports
 - 🔐 **Authentication** - Session-based login for single-user deployments
-- 🔔 **Alerts** *(Phase 5 - Coming Soon)* - Email and webhook notifications for misconfigurations
+- 🔔 **Webhook Alerts** - Real-time notifications via Slack, Discord, PagerDuty, and custom integrations
+- ⚠️ **Alert Rules** - Automated detection of infrastructure misconfigurations and anomalies
 
 ---
 
 ## Quick Start
 
-### Web Application (Recommended)
+### Web Application
 
+**Easy Setup (One Command):**
+
+```bash
+# 1. Clone repository
+git clone <repository-url>
+cd SneakyScan
+
+# 2. Run setup script
+./setup.sh
+
+# 3. Access web interface at http://localhost:5000
+```
+
+The setup script will:
+- Generate secure keys automatically
+- Create required directories
+- Build and start the Docker containers
+- Initialize the database on first run
+- Display your login credentials
+
+**Manual Setup (Alternative):**
+
 ```bash
 # 1. Clone repository
@@ -35,43 +58,24 @@ cd SneakyScan
 
 # 2. Configure environment
 cp .env.example .env
-# Edit .env and set SECRET_KEY and SNEAKYSCANNER_ENCRYPTION_KEY
+# Edit .env and set SECRET_KEY, SNEAKYSCANNER_ENCRYPTION_KEY, and INITIAL_PASSWORD
 
-# 3. Build and start
-docker compose build
-docker compose up -d
+# 3. Build and start (database auto-initializes on first run)
+docker compose up --build -d
 
-# 4. Initialize database
-docker compose run --rm init-db --password "YourSecurePassword"
-
-# 5. Access web interface
+# 4. Access web interface
 # Open http://localhost:5000
 ```
 
 **See [Deployment Guide](docs/DEPLOYMENT.md) for detailed setup instructions.**
 
 ### CLI Scanner (Standalone)
 
 For quick one-off scans without the web interface:
 
 ```bash
 # Build and run
 docker compose -f docker-compose-standalone.yml build
 docker compose -f docker-compose-standalone.yml up
 
 # Results saved to ./output/
 ```
 
 **See [CLI Scanning Guide](docs/CLI_SCANNING.md) for detailed usage.**
 
 ---
 
 ## Documentation
 
 ### User Guides
 - **[Deployment Guide](docs/DEPLOYMENT.md)** - Installation, configuration, and production deployment
 - **[CLI Scanning Guide](docs/CLI_SCANNING.md)** - Standalone scanner usage, configuration, and output formats
-- **[API Reference](docs/API_REFERENCE.md)** - Complete REST API documentation
+- **[API Reference](docs/API_REFERENCE.md)** - Complete REST API documentation for scripting and automation
 
 ### Developer Resources
 - **[Roadmap](docs/ROADMAP.md)** - Project roadmap, architecture, and planned features
@@ -80,27 +84,28 @@ docker compose -f docker-compose-standalone.yml up
 
 ## Current Status
 
-**Latest Version**: Phase 4 Complete ✅
-**Last Updated**: 2025-11-17
+**Latest Version**: Phase 5 Complete ✅
+**Last Updated**: 2025-11-19
 
 ### Completed Phases
 
 - ✅ **Phase 1**: Database schema, SQLAlchemy models, settings system
 - ✅ **Phase 2**: REST API, background jobs, authentication, web UI
 - ✅ **Phase 3**: Dashboard, scheduling, trend charts
-- ✅ **Phase 4**: Config creator, YAML editor, config management UI
+- ✅ **Phase 4**: Config creator, target editor, config management UI
+- ✅ **Phase 5**: Webhooks & alerting, notification templates, alert rules
 
-### Next Up: Phase 5 - Email, Webhooks & Comparisons
+### Next Up: Phase 6 - CLI as API Client
 
-**Core Use Case**: Monitor infrastructure for misconfigurations that expose unexpected ports/services. When a scan detects an open port not in the config's `expected_ports` list, trigger immediate notifications.
+**Goal**: Create a thin CLI client that calls the Flask API for scan operations, enabling scripting and automation workflows while leveraging centralized database storage and web dashboard features.
 
 **Planned Features**:
-- Email notifications for infrastructure changes
-- Webhook integrations (Slack, PagerDuty, custom SIEM)
-- Alert rule engine (unexpected ports, cert expiry, weak TLS)
-- Scan comparison reports for drift detection
+- API token authentication for CLI access
+- Remote scan triggering and status polling
+- Centralized scan history accessible via web dashboard
+- Scriptable automation workflows
 
-See [Roadmap](docs/ROADMAP.md) for complete feature timeline.
+See [Roadmap](docs/ROADMAP.md) for complete feature timeline and future phases.
 
 ---
 
@@ -168,7 +173,7 @@ See [Deployment Guide](docs/DEPLOYMENT.md) for production security checklist.
 
 ## Contributing
 
-This is a personal/small team project. For bugs or feature requests:
+This is a personal project. For bugs or feature requests:
 
 1. Check existing issues
 2. Create detailed bug reports with reproduction steps
@@ -186,7 +191,6 @@ MIT License - See LICENSE file for details
 
 **Documentation**:
 - [Deployment Guide](docs/DEPLOYMENT.md)
 - [CLI Scanning Guide](docs/CLI_SCANNING.md)
 - [API Reference](docs/API_REFERENCE.md)
 - [Roadmap](docs/ROADMAP.md)
 
@@ -194,5 +198,5 @@ MIT License - See LICENSE file for details
 
 ---
 
-**Version**: Phase 4 Complete
-**Last Updated**: 2025-11-17
+**Version**: 1.0.0-beta
+**Last Updated**: 2025-11-19
````
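The new webhook alerting described above presumably reduces to an authenticated HTTP POST with retries; a minimal sketch of such a delivery loop follows. The function name, payload shape, and backoff policy are illustrative assumptions, not SneakyScanner's actual implementation.

```python
# Minimal sketch of webhook delivery with retries, in the spirit of the
# README's Phase 5 feature list. Names and payload shape are assumptions.
import time

import requests


def deliver_webhook(url: str, payload: dict, token: str | None = None,
                    retries: int = 3, timeout: int = 10) -> bool:
    headers = {"Content-Type": "application/json"}
    if token:
        # Bearer auth is one of the auth types migration 004 below records
        headers["Authorization"] = f"Bearer {token}"
    for attempt in range(1, retries + 1):
        try:
            resp = requests.post(url, json=payload, headers=headers, timeout=timeout)
            if resp.ok:
                return True
            print(f"Attempt {attempt}: HTTP {resp.status_code}")
        except requests.RequestException as exc:
            print(f"Attempt {attempt}: {exc}")
        time.sleep(2 ** attempt)  # simple exponential backoff between attempts
    return False
```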
app/docker-entrypoint.sh (new file, 80 lines)
```bash
#!/bin/bash
set -e

# SneakyScanner Docker Entrypoint Script
# This script ensures the database is initialized before starting the Flask app

DB_PATH="${DATABASE_URL#sqlite:///}"  # Extract path from sqlite:////app/data/sneakyscanner.db
DB_DIR=$(dirname "$DB_PATH")
INIT_MARKER="$DB_DIR/.db_initialized"
PASSWORD_FILE="/app/logs/admin_password.txt"  # Save to logs dir (mounted, no permission issues)

echo "=== SneakyScanner Startup ==="
echo "Database path: $DB_PATH"
echo "Database directory: $DB_DIR"

# Ensure database directory exists
mkdir -p "$DB_DIR"

# Check if this is the first run (database doesn't exist or not initialized)
if [ ! -f "$DB_PATH" ] || [ ! -f "$INIT_MARKER" ]; then
    echo ""
    echo "=== First Run Detected ==="
    echo "Initializing database..."

    # Set default password from environment or generate a random one
    if [ -z "$INITIAL_PASSWORD" ]; then
        echo "INITIAL_PASSWORD not set, generating random password..."
        # Generate a 32-character alphanumeric password
        INITIAL_PASSWORD=$(cat /dev/urandom | tr -dc 'A-Za-z0-9' | head -c 32)
        # Ensure logs directory exists
        mkdir -p /app/logs
        echo "$INITIAL_PASSWORD" > "$PASSWORD_FILE"
        echo "✓ Random password generated and saved to: ./logs/admin_password.txt"
        SAVE_PASSWORD_MESSAGE=true
    fi

    # Run database initialization
    python3 /app/init_db.py \
        --db-url "$DATABASE_URL" \
        --password "$INITIAL_PASSWORD" \
        --no-migrations \
        --force

    # Create marker file to indicate successful initialization
    if [ $? -eq 0 ]; then
        touch "$INIT_MARKER"
        echo "✓ Database initialized successfully"
        echo ""
        echo "=== IMPORTANT ==="
        if [ "$SAVE_PASSWORD_MESSAGE" = "true" ]; then
            echo "Login password saved to: ./logs/admin_password.txt"
            echo "Password: $INITIAL_PASSWORD"
        else
            echo "Login password: $INITIAL_PASSWORD"
        fi
        echo "Please change this password after logging in!"
        echo "=================="
        echo ""
    else
        echo "✗ Database initialization failed!"
        exit 1
    fi
else
    echo "Database already initialized, skipping init..."
fi

# Apply any pending migrations (if using migrations in future)
if [ -f "/app/alembic.ini" ]; then
    echo "Checking for pending migrations..."
    # Uncomment when ready to use migrations:
    # alembic upgrade head
fi

echo ""
echo "=== Starting Flask Application ==="
echo "Flask will be available at http://localhost:5000"
echo ""

# Execute the main application
exec "$@"
```
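For reference, the entrypoint's first-run logic (strip the `sqlite:///` prefix, check a marker file, generate a password when `INITIAL_PASSWORD` is unset) can be sketched in Python. This is a minimal illustration under those assumptions, not code from the repo:

```python
# Sketch (not part of the repo) mirroring the entrypoint's first-run check.
# DATABASE_URL parsing and the .db_initialized marker convention come from the
# script above; everything else is illustrative.
import os
import secrets
import string
from pathlib import Path


def first_run_setup(database_url: str) -> str | None:
    """Return a generated admin password on first run, else None."""
    db_path = Path(database_url.removeprefix("sqlite:///"))
    marker = db_path.parent / ".db_initialized"
    db_path.parent.mkdir(parents=True, exist_ok=True)

    if db_path.exists() and marker.exists():
        return None  # already initialized, nothing to do

    # Same idea as `tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 32`
    alphabet = string.ascii_letters + string.digits
    password = os.environ.get("INITIAL_PASSWORD") or "".join(
        secrets.choice(alphabet) for _ in range(32)
    )
    # ... the real script runs init_db.py here and saves the password file ...
    marker.touch()  # mark success so later starts skip initialization
    return password
```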
app/init_db.py (118 changes)
```diff
@@ -23,11 +23,112 @@ from alembic import command
 from alembic.config import Config
 from sqlalchemy import create_engine
 from sqlalchemy.orm import sessionmaker
+from datetime import datetime, timezone
 
-from web.models import Base
+from web.models import Base, AlertRule
 from web.utils.settings import PasswordManager, SettingsManager
 
 
+def init_default_alert_rules(session):
+    """
+    Create default alert rules for Phase 5.
+
+    Args:
+        session: Database session
+    """
+    print("Initializing default alert rules...")
+
+    # Check if alert rules already exist
+    existing_rules = session.query(AlertRule).count()
+    if existing_rules > 0:
+        print(f"  Alert rules already exist ({existing_rules} rules), skipping...")
+        return
+
+    default_rules = [
+        {
+            'name': 'Unexpected Port Detection',
+            'rule_type': 'unexpected_port',
+            'enabled': True,
+            'threshold': None,
+            'email_enabled': False,
+            'webhook_enabled': False,
+            'severity': 'warning',
+            'filter_conditions': None,
+            'config_id': None
+        },
+        {
+            'name': 'Drift Detection',
+            'rule_type': 'drift_detection',
+            'enabled': True,
+            'threshold': None,  # No threshold means alert on any drift
+            'email_enabled': False,
+            'webhook_enabled': False,
+            'severity': 'info',
+            'filter_conditions': None,
+            'config_id': None
+        },
+        {
+            'name': 'Certificate Expiry Warning',
+            'rule_type': 'cert_expiry',
+            'enabled': True,
+            'threshold': 30,  # Alert when certs expire in 30 days
+            'email_enabled': False,
+            'webhook_enabled': False,
+            'severity': 'warning',
+            'filter_conditions': None,
+            'config_id': None
+        },
+        {
+            'name': 'Weak TLS Detection',
+            'rule_type': 'weak_tls',
+            'enabled': True,
+            'threshold': None,
+            'email_enabled': False,
+            'webhook_enabled': False,
+            'severity': 'warning',
+            'filter_conditions': None,
+            'config_id': None
+        },
+        {
+            'name': 'Host Down Detection',
+            'rule_type': 'ping_failed',
+            'enabled': True,
+            'threshold': None,
+            'email_enabled': False,
+            'webhook_enabled': False,
+            'severity': 'critical',
+            'filter_conditions': None,
+            'config_id': None
+        }
+    ]
+
+    try:
+        for rule_data in default_rules:
+            rule = AlertRule(
+                name=rule_data['name'],
+                rule_type=rule_data['rule_type'],
+                enabled=rule_data['enabled'],
+                threshold=rule_data['threshold'],
+                email_enabled=rule_data['email_enabled'],
+                webhook_enabled=rule_data['webhook_enabled'],
+                severity=rule_data['severity'],
+                filter_conditions=rule_data['filter_conditions'],
+                config_id=rule_data['config_id'],
+                created_at=datetime.now(timezone.utc),
+                updated_at=datetime.now(timezone.utc)
+            )
+            session.add(rule)
+            print(f"  ✓ Created rule: {rule.name}")
+
+        session.commit()
+        print(f"✓ Created {len(default_rules)} default alert rules")
+
+    except Exception as e:
+        print(f"✗ Failed to create default alert rules: {e}")
+        session.rollback()
+        raise
+
+
 def init_database(db_url: str = "sqlite:///./sneakyscanner.db", run_migrations: bool = True):
     """
     Initialize the database schema and settings.
@@ -78,6 +179,10 @@ def init_database(db_url: str = "sqlite:///./sneakyscanner.db", run_migrations:
         settings_manager = SettingsManager(session)
         settings_manager.init_defaults()
         print("✓ Default settings initialized")
+
+        # Initialize default alert rules
+        init_default_alert_rules(session)
+
     except Exception as e:
         print(f"✗ Failed to initialize settings: {e}")
         session.rollback()
@@ -164,6 +269,9 @@ Examples:
   # Use custom database URL
   python3 init_db.py --db-url postgresql://user:pass@localhost/sneakyscanner
 
+  # Force initialization without prompting (for Docker/scripts)
+  python3 init_db.py --force --password mysecret
+
   # Verify existing database
   python3 init_db.py --verify-only
 """
@@ -192,6 +300,12 @@ Examples:
         help='Create tables directly instead of using migrations'
     )
 
+    parser.add_argument(
+        '--force',
+        action='store_true',
+        help='Force initialization without prompting (for non-interactive environments)'
+    )
+
    args = parser.parse_args()
 
    # Check if database already exists
@@ -200,7 +314,7 @@ Examples:
     db_path = args.db_url.replace('sqlite:///', '')
     db_exists = Path(db_path).exists()
 
-    if db_exists and not args.verify_only:
+    if db_exists and not args.verify_only and not args.force:
         response = input(f"\nDatabase already exists at {db_path}. Reinitialize? (y/N): ")
         if response.lower() != 'y':
             print("Aborting.")
```
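The `init_default_alert_rules` function above guards against duplicate seeding by counting existing rows first. A minimal standalone sketch of that idempotent-seed pattern, reduced to plain sqlite3 with an illustrative schema (not the app's actual models):

```python
# Sketch of the "seed defaults only if the table is empty" pattern used by
# init_default_alert_rules; table and column names here are illustrative.
import sqlite3

DEFAULTS = [("Unexpected Port Detection", "unexpected_port", "warning"),
            ("Certificate Expiry Warning", "cert_expiry", "warning")]


def seed_alert_rules(conn: sqlite3.Connection) -> int:
    conn.execute("""CREATE TABLE IF NOT EXISTS alert_rules
                    (name TEXT, rule_type TEXT, severity TEXT)""")
    (count,) = conn.execute("SELECT COUNT(*) FROM alert_rules").fetchone()
    if count > 0:
        return 0  # already seeded; re-running the initializer is a no-op
    conn.executemany("INSERT INTO alert_rules VALUES (?, ?, ?)", DEFAULTS)
    conn.commit()
    return len(DEFAULTS)


if __name__ == "__main__":
    with sqlite3.connect(":memory:") as conn:
        print(seed_alert_rules(conn))  # -> 2
        print(seed_alert_rules(conn))  # -> 0 (idempotent)
```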
app/migrations/versions/004_add_alert_rule_enhancements.py (new file, 120 lines)
```python
"""Add enhanced alert features for Phase 5

Revision ID: 004
Revises: 003
Create Date: 2025-11-18

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic
revision = '004'
down_revision = '003'
branch_labels = None
depends_on = None


def upgrade():
    """
    Add enhancements for Phase 5 Alert Rule Engine:
    - Enhanced alert_rules fields
    - Enhanced alerts fields
    - New webhooks table
    - New webhook_delivery_log table
    """

    # Enhance alert_rules table
    with op.batch_alter_table('alert_rules') as batch_op:
        batch_op.add_column(sa.Column('name', sa.String(255), nullable=True, comment='User-friendly rule name'))
        batch_op.add_column(sa.Column('webhook_enabled', sa.Boolean(), nullable=False, server_default='0', comment='Whether to send webhooks for this rule'))
        batch_op.add_column(sa.Column('severity', sa.String(20), nullable=True, comment='Alert severity level (critical, warning, info)'))
        batch_op.add_column(sa.Column('filter_conditions', sa.Text(), nullable=True, comment='JSON filter conditions for the rule'))
        batch_op.add_column(sa.Column('config_file', sa.String(255), nullable=True, comment='Optional: specific config file this rule applies to'))
        batch_op.add_column(sa.Column('updated_at', sa.DateTime(), nullable=True, comment='Last update timestamp'))

    # Enhance alerts table
    with op.batch_alter_table('alerts') as batch_op:
        batch_op.add_column(sa.Column('rule_id', sa.Integer(), nullable=True, comment='Associated alert rule'))
        batch_op.add_column(sa.Column('webhook_sent', sa.Boolean(), nullable=False, server_default='0', comment='Whether webhook was sent'))
        batch_op.add_column(sa.Column('webhook_sent_at', sa.DateTime(), nullable=True, comment='When webhook was sent'))
        batch_op.add_column(sa.Column('acknowledged', sa.Boolean(), nullable=False, server_default='0', comment='Whether alert was acknowledged'))
        batch_op.add_column(sa.Column('acknowledged_at', sa.DateTime(), nullable=True, comment='When alert was acknowledged'))
        batch_op.add_column(sa.Column('acknowledged_by', sa.String(255), nullable=True, comment='User who acknowledged the alert'))
        batch_op.create_foreign_key('fk_alerts_rule_id', 'alert_rules', ['rule_id'], ['id'])
        batch_op.create_index('idx_alerts_rule_id', ['rule_id'])
        batch_op.create_index('idx_alerts_acknowledged', ['acknowledged'])

    # Create webhooks table
    op.create_table('webhooks',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('name', sa.String(255), nullable=False, comment='Webhook name'),
        sa.Column('url', sa.Text(), nullable=False, comment='Webhook URL'),
        sa.Column('enabled', sa.Boolean(), nullable=False, server_default='1', comment='Whether webhook is enabled'),
        sa.Column('auth_type', sa.String(20), nullable=True, comment='Authentication type: none, bearer, basic, custom'),
        sa.Column('auth_token', sa.Text(), nullable=True, comment='Encrypted authentication token'),
        sa.Column('custom_headers', sa.Text(), nullable=True, comment='JSON custom headers'),
        sa.Column('alert_types', sa.Text(), nullable=True, comment='JSON array of alert types to trigger on'),
        sa.Column('severity_filter', sa.Text(), nullable=True, comment='JSON array of severities to trigger on'),
        sa.Column('timeout', sa.Integer(), nullable=True, server_default='10', comment='Request timeout in seconds'),
        sa.Column('retry_count', sa.Integer(), nullable=True, server_default='3', comment='Number of retry attempts'),
        sa.Column('created_at', sa.DateTime(), nullable=False),
        sa.Column('updated_at', sa.DateTime(), nullable=False),
        sa.PrimaryKeyConstraint('id')
    )

    # Create webhook_delivery_log table
    op.create_table('webhook_delivery_log',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('webhook_id', sa.Integer(), nullable=False, comment='Associated webhook'),
        sa.Column('alert_id', sa.Integer(), nullable=False, comment='Associated alert'),
        sa.Column('status', sa.String(20), nullable=True, comment='Delivery status: success, failed, retrying'),
        sa.Column('response_code', sa.Integer(), nullable=True, comment='HTTP response code'),
        sa.Column('response_body', sa.Text(), nullable=True, comment='Response body from webhook'),
        sa.Column('error_message', sa.Text(), nullable=True, comment='Error message if failed'),
        sa.Column('attempt_number', sa.Integer(), nullable=True, comment='Which attempt this was'),
        sa.Column('delivered_at', sa.DateTime(), nullable=False, comment='Delivery timestamp'),
        sa.ForeignKeyConstraint(['webhook_id'], ['webhooks.id'], ),
        sa.ForeignKeyConstraint(['alert_id'], ['alerts.id'], ),
        sa.PrimaryKeyConstraint('id')
    )

    # Create indexes for webhook_delivery_log
    op.create_index('idx_webhook_delivery_alert_id', 'webhook_delivery_log', ['alert_id'])
    op.create_index('idx_webhook_delivery_webhook_id', 'webhook_delivery_log', ['webhook_id'])
    op.create_index('idx_webhook_delivery_status', 'webhook_delivery_log', ['status'])


def downgrade():
    """Remove Phase 5 alert enhancements."""

    # Drop webhook_delivery_log table and its indexes
    op.drop_index('idx_webhook_delivery_status', table_name='webhook_delivery_log')
    op.drop_index('idx_webhook_delivery_webhook_id', table_name='webhook_delivery_log')
    op.drop_index('idx_webhook_delivery_alert_id', table_name='webhook_delivery_log')
    op.drop_table('webhook_delivery_log')

    # Drop webhooks table
    op.drop_table('webhooks')

    # Remove enhancements from alerts table
    with op.batch_alter_table('alerts') as batch_op:
        batch_op.drop_index('idx_alerts_acknowledged')
        batch_op.drop_index('idx_alerts_rule_id')
        batch_op.drop_constraint('fk_alerts_rule_id', type_='foreignkey')
        batch_op.drop_column('acknowledged_by')
        batch_op.drop_column('acknowledged_at')
        batch_op.drop_column('acknowledged')
        batch_op.drop_column('webhook_sent_at')
        batch_op.drop_column('webhook_sent')
        batch_op.drop_column('rule_id')

    # Remove enhancements from alert_rules table
    with op.batch_alter_table('alert_rules') as batch_op:
        batch_op.drop_column('updated_at')
        batch_op.drop_column('config_file')
        batch_op.drop_column('filter_conditions')
        batch_op.drop_column('severity')
        batch_op.drop_column('webhook_enabled')
        batch_op.drop_column('name')
```
app/migrations/versions/005_add_webhook_templates.py (new file, 83 lines)
```python
"""Add webhook template support

Revision ID: 005
Revises: 004
Create Date: 2025-11-18

"""
from alembic import op
import sqlalchemy as sa
import json


# revision identifiers, used by Alembic
revision = '005'
down_revision = '004'
branch_labels = None
depends_on = None


# Default template that matches the current JSON payload structure
DEFAULT_TEMPLATE = """{
  "event": "alert.created",
  "alert": {
    "id": {{ alert.id }},
    "type": "{{ alert.type }}",
    "severity": "{{ alert.severity }}",
    "message": "{{ alert.message }}",
    {% if alert.ip_address %}"ip_address": "{{ alert.ip_address }}",{% endif %}
    {% if alert.port %}"port": {{ alert.port }},{% endif %}
    "acknowledged": {{ alert.acknowledged|lower }},
    "created_at": "{{ alert.created_at.isoformat() }}"
  },
  "scan": {
    "id": {{ scan.id }},
    "title": "{{ scan.title }}",
    "timestamp": "{{ scan.timestamp.isoformat() }}",
    "status": "{{ scan.status }}"
  },
  "rule": {
    "id": {{ rule.id }},
    "name": "{{ rule.name }}",
    "type": "{{ rule.type }}",
    "threshold": {{ rule.threshold if rule.threshold else 'null' }}
  }
}"""


def upgrade():
    """
    Add webhook template fields:
    - template: Jinja2 template for payload
    - template_format: Output format (json, text)
    - content_type_override: Optional custom Content-Type
    """

    # Add new columns to webhooks table
    with op.batch_alter_table('webhooks') as batch_op:
        batch_op.add_column(sa.Column('template', sa.Text(), nullable=True, comment='Jinja2 template for webhook payload'))
        batch_op.add_column(sa.Column('template_format', sa.String(20), nullable=True, server_default='json', comment='Template output format: json, text'))
        batch_op.add_column(sa.Column('content_type_override', sa.String(100), nullable=True, comment='Optional custom Content-Type header'))

    # Populate existing webhooks with default template
    # This ensures backward compatibility by converting existing webhooks to use the
    # same JSON structure they're currently sending
    connection = op.get_bind()
    connection.execute(
        sa.text("""
            UPDATE webhooks
            SET template = :template,
                template_format = 'json'
            WHERE template IS NULL
        """),
        {"template": DEFAULT_TEMPLATE}
    )


def downgrade():
    """Remove webhook template fields."""

    with op.batch_alter_table('webhooks') as batch_op:
        batch_op.drop_column('content_type_override')
        batch_op.drop_column('template_format')
        batch_op.drop_column('template')
```
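At delivery time the stored template is presumably rendered with Jinja2 against the alert, scan, and rule objects. A minimal sketch using SimpleNamespace stand-ins and an abridged copy of the template (the real app would pass its model instances and the full DEFAULT_TEMPLATE above):

```python
# Sketch of rendering a webhook template with Jinja2. The alert/rule objects
# are SimpleNamespace stand-ins and the template is abridged from migration
# 005's DEFAULT_TEMPLATE; neither is the app's actual code.
import json
from datetime import datetime, timezone
from types import SimpleNamespace

from jinja2 import Template

TEMPLATE = """{
  "alert": {
    "id": {{ alert.id }},
    "severity": "{{ alert.severity }}",
    {% if alert.port %}"port": {{ alert.port }},{% endif %}
    "acknowledged": {{ alert.acknowledged|lower }},
    "created_at": "{{ alert.created_at.isoformat() }}"
  },
  "rule": { "threshold": {{ rule.threshold if rule.threshold else 'null' }} }
}"""

alert = SimpleNamespace(id=1, severity="warning", port=8080,
                        acknowledged=False,
                        created_at=datetime.now(timezone.utc))
rule = SimpleNamespace(threshold=None)

payload = Template(TEMPLATE).render(alert=alert, rule=rule)
print(json.loads(payload)["alert"]["severity"])  # valid JSON -> "warning"
```

Note how the `|lower` filter turns Python's `False` into JSON's `false`, and the `{% if %}` guards drop optional fields, which is why the rendered output stays parseable.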
app/migrations/versions/006_add_reusable_sites.py (new file, 161 lines)
```python
"""Add reusable site definitions

Revision ID: 006
Revises: 005
Create Date: 2025-11-19

This migration introduces reusable site definitions that can be shared across
multiple scans. Sites are defined once with CIDR ranges and can be referenced
in multiple scan configurations.
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import text


# revision identifiers, used by Alembic
revision = '006'
down_revision = '005'
branch_labels = None
depends_on = None


def upgrade():
    """
    Create new site tables and migrate existing scan_sites data to the new structure.
    """

    # Create sites table (master site definitions)
    op.create_table('sites',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('name', sa.String(length=255), nullable=False, comment='Unique site name'),
        sa.Column('description', sa.Text(), nullable=True, comment='Site description'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Site creation time'),
        sa.Column('updated_at', sa.DateTime(), nullable=False, comment='Last modification time'),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('name', name='uix_site_name')
    )
    op.create_index(op.f('ix_sites_name'), 'sites', ['name'], unique=True)

    # Create site_cidrs table (CIDR ranges for each site)
    op.create_table('site_cidrs',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('site_id', sa.Integer(), nullable=False, comment='FK to sites'),
        sa.Column('cidr', sa.String(length=45), nullable=False, comment='CIDR notation (e.g., 10.0.0.0/24)'),
        sa.Column('expected_ping', sa.Boolean(), nullable=True, comment='Expected ping response for this CIDR'),
        sa.Column('expected_tcp_ports', sa.Text(), nullable=True, comment='JSON array of expected TCP ports'),
        sa.Column('expected_udp_ports', sa.Text(), nullable=True, comment='JSON array of expected UDP ports'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='CIDR creation time'),
        sa.ForeignKeyConstraint(['site_id'], ['sites.id'], ),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('site_id', 'cidr', name='uix_site_cidr')
    )
    op.create_index(op.f('ix_site_cidrs_site_id'), 'site_cidrs', ['site_id'], unique=False)

    # Create site_ips table (IP-level overrides within CIDRs)
    op.create_table('site_ips',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('site_cidr_id', sa.Integer(), nullable=False, comment='FK to site_cidrs'),
        sa.Column('ip_address', sa.String(length=45), nullable=False, comment='IPv4 or IPv6 address'),
        sa.Column('expected_ping', sa.Boolean(), nullable=True, comment='Override ping expectation for this IP'),
        sa.Column('expected_tcp_ports', sa.Text(), nullable=True, comment='JSON array of expected TCP ports (overrides CIDR)'),
        sa.Column('expected_udp_ports', sa.Text(), nullable=True, comment='JSON array of expected UDP ports (overrides CIDR)'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='IP override creation time'),
        sa.ForeignKeyConstraint(['site_cidr_id'], ['site_cidrs.id'], ),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('site_cidr_id', 'ip_address', name='uix_site_cidr_ip')
    )
    op.create_index(op.f('ix_site_ips_site_cidr_id'), 'site_ips', ['site_cidr_id'], unique=False)

    # Create scan_site_associations table (many-to-many between scans and sites)
    op.create_table('scan_site_associations',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('site_id', sa.Integer(), nullable=False, comment='FK to sites'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Association creation time'),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.ForeignKeyConstraint(['site_id'], ['sites.id'], ),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('scan_id', 'site_id', name='uix_scan_site')
    )
    op.create_index(op.f('ix_scan_site_associations_scan_id'), 'scan_site_associations', ['scan_id'], unique=False)
    op.create_index(op.f('ix_scan_site_associations_site_id'), 'scan_site_associations', ['site_id'], unique=False)

    # Migrate existing data
    connection = op.get_bind()

    # 1. Extract unique site names from existing scan_sites and create master Site records
    # This groups all historical scan sites by name and creates one master site per unique name
    connection.execute(text("""
        INSERT INTO sites (name, description, created_at, updated_at)
        SELECT DISTINCT
            site_name,
            'Migrated from scan_sites' as description,
            datetime('now') as created_at,
            datetime('now') as updated_at
        FROM scan_sites
        WHERE site_name NOT IN (SELECT name FROM sites)
    """))

    # 2. Create scan_site_associations linking scans to their sites
    # This maintains the historical relationship between scans and the sites they used
    connection.execute(text("""
        INSERT INTO scan_site_associations (scan_id, site_id, created_at)
        SELECT DISTINCT
            ss.scan_id,
            s.id as site_id,
            datetime('now') as created_at
        FROM scan_sites ss
        INNER JOIN sites s ON s.name = ss.site_name
        WHERE NOT EXISTS (
            SELECT 1 FROM scan_site_associations ssa
            WHERE ssa.scan_id = ss.scan_id AND ssa.site_id = s.id
        )
    """))

    # 3. For each migrated site, create a CIDR entry from the IPs in scan_ips
    # Since historical data has individual IPs, we'll create /32 CIDRs for each unique IP
    # This preserves the exact IP addresses while fitting them into the new CIDR-based model
    connection.execute(text("""
        INSERT INTO site_cidrs (site_id, cidr, expected_ping, expected_tcp_ports, expected_udp_ports, created_at)
        SELECT DISTINCT
            s.id as site_id,
            si.ip_address || '/32' as cidr,
            si.ping_expected,
            '[]' as expected_tcp_ports,
            '[]' as expected_udp_ports,
            datetime('now') as created_at
        FROM scan_ips si
        INNER JOIN scan_sites ss ON ss.id = si.site_id
        INNER JOIN sites s ON s.name = ss.site_name
        WHERE NOT EXISTS (
            SELECT 1 FROM site_cidrs sc
            WHERE sc.site_id = s.id AND sc.cidr = si.ip_address || '/32'
        )
        GROUP BY s.id, si.ip_address, si.ping_expected
    """))

    print("✓ Migration complete: Reusable sites created from historical scan data")
    print(f"  - Created {connection.execute(text('SELECT COUNT(*) FROM sites')).scalar()} master site(s)")
    print(f"  - Created {connection.execute(text('SELECT COUNT(*) FROM site_cidrs')).scalar()} CIDR range(s)")
    print(f"  - Created {connection.execute(text('SELECT COUNT(*) FROM scan_site_associations')).scalar()} scan-site association(s)")


def downgrade():
    """Remove reusable site tables."""

    # Drop tables in reverse order of creation (respecting foreign keys)
    op.drop_index(op.f('ix_scan_site_associations_site_id'), table_name='scan_site_associations')
    op.drop_index(op.f('ix_scan_site_associations_scan_id'), table_name='scan_site_associations')
    op.drop_table('scan_site_associations')

    op.drop_index(op.f('ix_site_ips_site_cidr_id'), table_name='site_ips')
    op.drop_table('site_ips')

    op.drop_index(op.f('ix_site_cidrs_site_id'), table_name='site_cidrs')
    op.drop_table('site_cidrs')

    op.drop_index(op.f('ix_sites_name'), table_name='sites')
    op.drop_table('sites')

    print("✓ Downgrade complete: Reusable site tables removed")
```
app/migrations/versions/007_configs_to_database.py (new file, 102 lines)
```python
"""Add database-stored scan configurations

Revision ID: 007
Revises: 006
Create Date: 2025-11-19

This migration introduces database-stored scan configurations to replace YAML
config files. Configs reference sites from the sites table, enabling visual
config builder and better data management.
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import text


# revision identifiers, used by Alembic
revision = '007'
down_revision = '006'
branch_labels = None
depends_on = None


def upgrade():
    """
    Create scan_configs and scan_config_sites tables.
    Add config_id foreign keys to scans and schedules tables.
    """

    # Create scan_configs table
    op.create_table('scan_configs',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('title', sa.String(length=255), nullable=False, comment='Configuration title'),
        sa.Column('description', sa.Text(), nullable=True, comment='Configuration description'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Config creation time'),
        sa.Column('updated_at', sa.DateTime(), nullable=False, comment='Last modification time'),
        sa.PrimaryKeyConstraint('id')
    )

    # Create scan_config_sites table (many-to-many between configs and sites)
    op.create_table('scan_config_sites',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('config_id', sa.Integer(), nullable=False, comment='FK to scan_configs'),
        sa.Column('site_id', sa.Integer(), nullable=False, comment='FK to sites'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Association creation time'),
        sa.ForeignKeyConstraint(['config_id'], ['scan_configs.id'], ),
        sa.ForeignKeyConstraint(['site_id'], ['sites.id'], ),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('config_id', 'site_id', name='uix_config_site')
    )
    op.create_index(op.f('ix_scan_config_sites_config_id'), 'scan_config_sites', ['config_id'], unique=False)
    op.create_index(op.f('ix_scan_config_sites_site_id'), 'scan_config_sites', ['site_id'], unique=False)

    # Add config_id to scans table
    with op.batch_alter_table('scans', schema=None) as batch_op:
        batch_op.add_column(sa.Column('config_id', sa.Integer(), nullable=True, comment='FK to scan_configs table'))
        batch_op.create_index('ix_scans_config_id', ['config_id'], unique=False)
        batch_op.create_foreign_key('fk_scans_config_id', 'scan_configs', ['config_id'], ['id'])
        # Mark config_file as deprecated in comment (already has nullable=True)

    # Add config_id to schedules table and make config_file nullable
    with op.batch_alter_table('schedules', schema=None) as batch_op:
        batch_op.add_column(sa.Column('config_id', sa.Integer(), nullable=True, comment='FK to scan_configs table'))
        batch_op.create_index('ix_schedules_config_id', ['config_id'], unique=False)
        batch_op.create_foreign_key('fk_schedules_config_id', 'scan_configs', ['config_id'], ['id'])
        # Make config_file nullable (it was required before)
        batch_op.alter_column('config_file', existing_type=sa.Text(), nullable=True)

    connection = op.get_bind()

    print("✓ Migration complete: Scan configs tables created")
    print("  - Created scan_configs table for database-stored configurations")
    print("  - Created scan_config_sites association table")
    print("  - Added config_id to scans table")
    print("  - Added config_id to schedules table")
    print("  - Existing YAML configs remain in config_file column for backward compatibility")


def downgrade():
    """Remove scan config tables and columns."""

    # Remove foreign keys and columns from schedules
    with op.batch_alter_table('schedules', schema=None) as batch_op:
        batch_op.drop_constraint('fk_schedules_config_id', type_='foreignkey')
        batch_op.drop_index('ix_schedules_config_id')
        batch_op.drop_column('config_id')
        # Restore config_file as required
        batch_op.alter_column('config_file', existing_type=sa.Text(), nullable=False)

    # Remove foreign keys and columns from scans
    with op.batch_alter_table('scans', schema=None) as batch_op:
        batch_op.drop_constraint('fk_scans_config_id', type_='foreignkey')
        batch_op.drop_index('ix_scans_config_id')
        batch_op.drop_column('config_id')

    # Drop tables in reverse order
    op.drop_index(op.f('ix_scan_config_sites_site_id'), table_name='scan_config_sites')
    op.drop_index(op.f('ix_scan_config_sites_config_id'), table_name='scan_config_sites')
    op.drop_table('scan_config_sites')

    op.drop_table('scan_configs')

    print("✓ Downgrade complete: Scan config tables and columns removed")
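```

Migration 007 keeps legacy YAML paths in `config_file` while new records use `config_id`, so a consumer would presumably prefer the database config and fall back to the file. A minimal sketch of that dual-read pattern (the `Schedule` shape and loader callables are hypothetical stand-ins, not the app's real API):

```python
# Sketch of the dual-read pattern implied by migration 007: prefer the new
# config_id FK, fall back to the legacy config_file YAML path.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Schedule:
    config_id: Optional[int]
    config_file: Optional[str]


def resolve_config(schedule: Schedule,
                   load_db: Callable[[int], dict],
                   load_yaml: Callable[[str], dict]) -> dict:
    """Prefer the database-stored config; fall back to the legacy YAML file."""
    if schedule.config_id is not None:
        return load_db(schedule.config_id)
    if schedule.config_file:
        return load_yaml(schedule.config_file)
    raise ValueError("schedule has neither config_id nor config_file")


if __name__ == "__main__":
    sched = Schedule(config_id=None, config_file="configs/prod.yaml")
    cfg = resolve_config(sched,
                         load_db=lambda cid: {"source": "db", "id": cid},
                         load_yaml=lambda path: {"source": "yaml", "path": path})
    print(cfg)  # {'source': 'yaml', 'path': 'configs/prod.yaml'}
```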
app/migrations/versions/008_expand_cidrs_to_ips.py (new file, 270 lines)
```python
"""Expand CIDRs to individual IPs with per-IP settings

Revision ID: 008
Revises: 007
Create Date: 2025-11-19

This migration changes the site architecture to automatically expand CIDRs into
individual IPs in the database. Each IP has its own port and ping settings.

Changes:
- Add site_id to site_ips (direct link to sites, support standalone IPs)
- Make site_cidr_id nullable (IPs can exist without a CIDR parent)
- Remove settings from site_cidrs (settings now only at IP level)
- Add unique constraint: no duplicate IPs within a site
- Expand existing CIDRs to individual IPs
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import text
import ipaddress


# revision identifiers, used by Alembic
revision = '008'
down_revision = '007'
branch_labels = None
depends_on = None


def upgrade():
    """
    Modify schema to support per-IP settings and auto-expand CIDRs.
    """

    connection = op.get_bind()

    # Check if site_id column already exists
    inspector = sa.inspect(connection)
    site_ips_columns = [col['name'] for col in inspector.get_columns('site_ips')]
    site_cidrs_columns = [col['name'] for col in inspector.get_columns('site_cidrs')]

    # Step 1: Add site_id column to site_ips (will be populated from site_cidr_id)
    if 'site_id' not in site_ips_columns:
        print("Adding site_id column to site_ips...")
        op.add_column('site_ips', sa.Column('site_id', sa.Integer(), nullable=True, comment='FK to sites (direct link)'))
    else:
        print("site_id column already exists in site_ips, skipping...")

    # Step 2: Populate site_id from site_cidr_id (before we make it nullable)
    print("Populating site_id from existing site_cidr relationships...")
    connection.execute(text("""
        UPDATE site_ips
        SET site_id = (
            SELECT site_id
            FROM site_cidrs
            WHERE site_cidrs.id = site_ips.site_cidr_id
        )
        WHERE site_cidr_id IS NOT NULL
    """))

    # Step 3: Make site_id NOT NULL and add foreign key
    # Check if foreign key exists before creating
    try:
        op.alter_column('site_ips', 'site_id', nullable=False)
        print("Made site_id NOT NULL")
    except Exception as e:
        print(f"site_id already NOT NULL or error: {e}")

    # Check if foreign key exists
    try:
        op.create_foreign_key('fk_site_ips_site_id', 'site_ips', 'sites', ['site_id'], ['id'])
        print("Created foreign key fk_site_ips_site_id")
    except Exception as e:
        print(f"Foreign key already exists or error: {e}")

    # Check if index exists
    try:
        op.create_index(op.f('ix_site_ips_site_id'), 'site_ips', ['site_id'], unique=False)
        print("Created index ix_site_ips_site_id")
    except Exception as e:
        print(f"Index already exists or error: {e}")

    # Step 4: Make site_cidr_id nullable (for standalone IPs)
    try:
        op.alter_column('site_ips', 'site_cidr_id', nullable=True)
        print("Made site_cidr_id nullable")
    except Exception as e:
        print(f"site_cidr_id already nullable or error: {e}")

    # Step 5: Drop old unique constraint and create new one (site_id, ip_address)
    # This prevents duplicate IPs within a site (across all CIDRs and standalone)
    try:
        op.drop_constraint('uix_site_cidr_ip', 'site_ips', type_='unique')
        print("Dropped old constraint uix_site_cidr_ip")
    except Exception as e:
        print(f"Constraint already dropped or doesn't exist: {e}")

    try:
        op.create_unique_constraint('uix_site_ip_address', 'site_ips', ['site_id', 'ip_address'])
        print("Created new constraint uix_site_ip_address")
    except Exception as e:
        print(f"Constraint already exists or error: {e}")

    # Step 6: Expand existing CIDRs to individual IPs
    print("Expanding existing CIDRs to individual IPs...")

    # Get all existing CIDRs
    cidrs = connection.execute(text("""
        SELECT id, site_id, cidr, expected_ping, expected_tcp_ports, expected_udp_ports
        FROM site_cidrs
    """)).fetchall()

    expanded_count = 0
    skipped_count = 0

    for cidr_row in cidrs:
        cidr_id, site_id, cidr_str, expected_ping, expected_tcp_ports, expected_udp_ports = cidr_row

        try:
            # Parse CIDR
            network = ipaddress.ip_network(cidr_str, strict=False)

            # Check size - skip if too large (> /24 for IPv4, > /64 for IPv6)
            if isinstance(network, ipaddress.IPv4Network) and network.prefixlen < 24:
                print(f"  ⚠ Skipping large CIDR {cidr_str} (>{network.num_addresses} IPs)")
                skipped_count += 1
                continue
            elif isinstance(network, ipaddress.IPv6Network) and network.prefixlen < 64:
                print(f"  ⚠ Skipping large CIDR {cidr_str} (>{network.num_addresses} IPs)")
                skipped_count += 1
                continue

            # Expand to individual IPs
            for ip in network.hosts() if network.num_addresses > 2 else [network.network_address]:
                ip_str = str(ip)

                # Check if this IP already exists (from previous IP overrides)
                existing = connection.execute(text("""
                    SELECT id FROM site_ips
                    WHERE site_cidr_id = :cidr_id AND ip_address = :ip_address
                """), {'cidr_id': cidr_id, 'ip_address': ip_str}).fetchone()

                if not existing:
                    # Insert new IP with settings from CIDR
                    connection.execute(text("""
                        INSERT INTO site_ips (
                            site_id, site_cidr_id, ip_address,
                            expected_ping, expected_tcp_ports, expected_udp_ports,
                            created_at
                        )
                        VALUES (
                            :site_id, :cidr_id, :ip_address,
                            :expected_ping, :expected_tcp_ports, :expected_udp_ports,
                            datetime('now')
                        )
                    """), {
                        'site_id': site_id,
                        'cidr_id': cidr_id,
                        'ip_address': ip_str,
                        'expected_ping': expected_ping,
                        'expected_tcp_ports': expected_tcp_ports,
                        'expected_udp_ports': expected_udp_ports
                    })
                    expanded_count += 1

        except Exception as e:
            print(f"  ✗ Error expanding CIDR {cidr_str}: {e}")
            skipped_count += 1
            continue

    print(f"  ✓ Expanded {expanded_count} IPs from CIDRs")
    if skipped_count > 0:
        print(f"  ⚠ Skipped {skipped_count} CIDRs (too large or errors)")

    # Step 7: Remove settings columns from site_cidrs (now only at IP level)
    print("Removing settings columns from site_cidrs...")
    # Re-inspect to get current columns
    site_cidrs_columns = [col['name'] for col in inspector.get_columns('site_cidrs')]

    if 'expected_ping' in site_cidrs_columns:
        try:
            op.drop_column('site_cidrs', 'expected_ping')
            print("Dropped expected_ping from site_cidrs")
        except Exception as e:
            print(f"Error dropping expected_ping: {e}")
    else:
        print("expected_ping already dropped from site_cidrs")

    if 'expected_tcp_ports' in site_cidrs_columns:
        try:
            op.drop_column('site_cidrs', 'expected_tcp_ports')
            print("Dropped expected_tcp_ports from site_cidrs")
        except Exception as e:
            print(f"Error dropping expected_tcp_ports: {e}")
    else:
        print("expected_tcp_ports already dropped from site_cidrs")

    if 'expected_udp_ports' in site_cidrs_columns:
        try:
            op.drop_column('site_cidrs', 'expected_udp_ports')
            print("Dropped expected_udp_ports from site_cidrs")
        except Exception as e:
            print(f"Error dropping expected_udp_ports: {e}")
    else:
        print("expected_udp_ports already dropped from site_cidrs")

    # Print summary
    total_sites = connection.execute(text('SELECT COUNT(*) FROM sites')).scalar()
    total_cidrs = connection.execute(text('SELECT COUNT(*) FROM site_cidrs')).scalar()
    total_ips = connection.execute(text('SELECT COUNT(*) FROM site_ips')).scalar()

    print("\n✓ Migration 008 complete: CIDRs expanded to individual IPs")
    print(f"  - Total sites: {total_sites}")
    print(f"  - Total CIDRs: {total_cidrs}")
    print(f"  - Total IPs: {total_ips}")


def downgrade():
    """
    Revert schema changes (restore CIDR-level settings).
    Note: This will lose per-IP granularity!
    """

    connection = op.get_bind()

    print("Rolling back to CIDR-level settings...")

    # Step 1: Add settings columns back to site_cidrs
    op.add_column('site_cidrs', sa.Column('expected_ping', sa.Boolean(), nullable=True))
    op.add_column('site_cidrs', sa.Column('expected_tcp_ports', sa.Text(), nullable=True))
    op.add_column('site_cidrs', sa.Column('expected_udp_ports', sa.Text(), nullable=True))

    # Step 2: Populate CIDR settings from first IP in each CIDR (approximation)
    connection.execute(text("""
        UPDATE site_cidrs
        SET
            expected_ping = (
                SELECT expected_ping FROM site_ips
                WHERE site_ips.site_cidr_id = site_cidrs.id
                LIMIT 1
            ),
            expected_tcp_ports = (
                SELECT expected_tcp_ports FROM site_ips
                WHERE site_ips.site_cidr_id = site_cidrs.id
                LIMIT 1
            ),
            expected_udp_ports = (
                SELECT expected_udp_ports FROM site_ips
                WHERE site_ips.site_cidr_id = site_cidrs.id
                LIMIT 1
            )
    """))

    # Step 3: Delete auto-expanded IPs (keep only original overrides)
    # In practice, this is difficult to determine, so we'll keep all IPs
    # and just remove the schema changes

    # Step 4: Drop new unique constraint and restore old one
    op.drop_constraint('uix_site_ip_address', 'site_ips', type_='unique')
    op.create_unique_constraint('uix_site_cidr_ip', 'site_ips', ['site_cidr_id', 'ip_address'])

    # Step 5: Make site_cidr_id NOT NULL again
    op.alter_column('site_ips', 'site_cidr_id', nullable=False)

    # Step 6: Drop site_id column and related constraints
    op.drop_index(op.f('ix_site_ips_site_id'), table_name='site_ips')
    op.drop_constraint('fk_site_ips_site_id', 'site_ips', type_='foreignkey')
    op.drop_column('site_ips', 'site_id')

    print("✓ Downgrade complete: Reverted to CIDR-level settings")
```
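Migration 008's expansion step relies on the stdlib `ipaddress` module; a standalone sketch of the same expand-with-size-guard logic, runnable outside Alembic:

```python
# Standalone sketch of migration 008's CIDR-expansion logic, mirroring its
# /24 size guard and its hosts()-vs-single-address branch.
import ipaddress


def expand_cidr(cidr: str, max_prefix: int = 24) -> list[str]:
    network = ipaddress.ip_network(cidr, strict=False)
    if isinstance(network, ipaddress.IPv4Network) and network.prefixlen < max_prefix:
        raise ValueError(f"{cidr} too large to expand ({network.num_addresses} addresses)")
    # hosts() skips the network/broadcast addresses; for /31 and /32 the
    # migration falls back to the network address itself
    ips = network.hosts() if network.num_addresses > 2 else [network.network_address]
    return [str(ip) for ip in ips]


print(expand_cidr("10.0.0.0/30"))   # ['10.0.0.1', '10.0.0.2']
print(expand_cidr("192.0.2.7/32"))  # ['192.0.2.7']
```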
app/migrations/versions/009_remove_cidrs.py (new file, 210 lines)
```python
"""Remove CIDR table - make sites IP-only

Revision ID: 009
Revises: 008
Create Date: 2025-11-19

This migration removes the SiteCIDR table entirely, making sites purely
IP-based. CIDRs are now only used as a convenience for bulk IP addition,
not stored as permanent entities.

Changes:
- Set all site_ips.site_cidr_id to NULL (preserve all IPs)
- Drop foreign key from site_ips to site_cidrs
- Drop site_cidrs table
- Remove site_cidr_id column from site_ips

All existing IPs are preserved. They become "standalone" IPs without
a CIDR parent.
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import text


# revision identifiers, used by Alembic
revision = '009'
down_revision = '008'
branch_labels = None
depends_on = None


def upgrade():
    """
    Remove CIDR table and make all IPs standalone.
    """

    connection = op.get_bind()
    inspector = sa.inspect(connection)

    print("\n=== Migration 009: Remove CIDR Table ===\n")

    # Get counts before migration
    try:
        total_cidrs = connection.execute(text('SELECT COUNT(*) FROM site_cidrs')).scalar()
        total_ips = connection.execute(text('SELECT COUNT(*) FROM site_ips')).scalar()
        ips_with_cidr = connection.execute(text(
            'SELECT COUNT(*) FROM site_ips WHERE site_cidr_id IS NOT NULL'
        )).scalar()

        print(f"Before migration:")
        print(f"  - Total CIDRs: {total_cidrs}")
        print(f"  - Total IPs: {total_ips}")
        print(f"  - IPs linked to CIDRs: {ips_with_cidr}")
        print(f"  - Standalone IPs: {total_ips - ips_with_cidr}\n")
    except Exception as e:
        print(f"Could not get pre-migration stats: {e}\n")

    # Step 1: Set all site_cidr_id to NULL (preserve all IPs as standalone)
    print("Step 1: Converting all IPs to standalone (nulling CIDR associations)...")
    try:
        result = connection.execute(text("""
            UPDATE site_ips
            SET site_cidr_id = NULL
            WHERE site_cidr_id IS NOT NULL
        """))
        print(f"  ✓ Converted {result.rowcount} IPs to standalone\n")
    except Exception as e:
        print(f"  ⚠ Error or already done: {e}\n")

    # Step 2: Drop foreign key constraint from site_ips to site_cidrs
    print("Step 2: Dropping foreign key constraint from site_ips to site_cidrs...")
    foreign_keys = inspector.get_foreign_keys('site_ips')
    fk_to_drop = None

    for fk in foreign_keys:
        if fk['referred_table'] == 'site_cidrs':
            fk_to_drop = fk['name']
            break

    if fk_to_drop:
        try:
            op.drop_constraint(fk_to_drop, 'site_ips', type_='foreignkey')
            print(f"  ✓ Dropped foreign key constraint: {fk_to_drop}\n")
        except Exception as e:
            print(f"  ⚠ Could not drop foreign key: {e}\n")
    else:
        print("  ⚠ Foreign key constraint not found or already dropped\n")

    # Step 3: Drop index on site_cidr_id (if exists)
    print("Step 3: Dropping index on site_cidr_id...")
    indexes = inspector.get_indexes('site_ips')
    index_to_drop = None

    for idx in indexes:
        if 'site_cidr_id' in idx['column_names']:
            index_to_drop = idx['name']
            break

    if index_to_drop:
        try:
            op.drop_index(index_to_drop, table_name='site_ips')
            print(f"  ✓ Dropped index: {index_to_drop}\n")
        except Exception as e:
            print(f"  ⚠ Could not drop index: {e}\n")
    else:
        print("  ⚠ Index not found or already dropped\n")

    # Step 4: Drop site_cidrs table
    print("Step 4: Dropping site_cidrs table...")
    tables = inspector.get_table_names()

    if 'site_cidrs' in tables:
        try:
            op.drop_table('site_cidrs')
            print("  ✓ Dropped site_cidrs table\n")
        except Exception as e:
            print(f"  ⚠ Could not drop table: {e}\n")
    else:
        print("  ⚠ Table site_cidrs not found or already dropped\n")

    # Step 5: Drop site_cidr_id column from site_ips
    print("Step 5: Dropping site_cidr_id column from site_ips...")
    site_ips_columns = [col['name'] for col in inspector.get_columns('site_ips')]

    if 'site_cidr_id' in site_ips_columns:
        try:
            op.drop_column('site_ips', 'site_cidr_id')
            print("  ✓ Dropped site_cidr_id column from site_ips\n")
        except Exception as e:
            print(f"  ⚠ Could not drop column: {e}\n")
    else:
        print("  ⚠ Column site_cidr_id not found or already dropped\n")

    # Get counts after migration
    try:
        final_ips = connection.execute(text('SELECT COUNT(*) FROM site_ips')).scalar()
        total_sites = connection.execute(text('SELECT COUNT(*) FROM sites')).scalar()

        print("After migration:")
        print(f"  - Total sites: {total_sites}")
        print(f"  - Total IPs (all standalone): {final_ips}")
        print(f"  - CIDRs: N/A (table removed)")
    except Exception as e:
        print(f"Could not get post-migration stats: {e}")

    print("\n✓ Migration 009 complete: Sites are now IP-only")
    print("  All IPs preserved as standalone. CIDRs can still be used")
    print("  via the API/UI for bulk IP creation, but are not stored.\n")


def downgrade():
    """
    Recreate site_cidrs table (CANNOT restore original CIDR associations).

    WARNING: This downgrade creates an empty site_cidrs table structure but
    cannot restore the original CIDR-to-IP associations since that data was
    deleted. All IPs will remain standalone.
    """

    connection = op.get_bind()

    print("\n=== Downgrade 009: Recreate CIDR Table Structure ===\n")
    print("⚠ WARNING: Cannot restore original CIDR associations!")
    print("  The site_cidrs table structure will be recreated but will be empty.")
    print("  All IPs will remain standalone. This is a PARTIAL downgrade.\n")

    # Step 1: Recreate site_cidrs table (empty)
    print("Step 1: Recreating site_cidrs table structure...")
    try:
        op.create_table(
            'site_cidrs',
            sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
            sa.Column('site_id', sa.Integer(), nullable=False),
            sa.Column('cidr', sa.String(length=45), nullable=False, comment='CIDR notation (e.g., 10.0.0.0/24)'),
            sa.Column('created_at', sa.DateTime(), nullable=False),
            sa.PrimaryKeyConstraint('id'),
            sa.ForeignKeyConstraint(['site_id'], ['sites.id'], ),
            sa.UniqueConstraint('site_id', 'cidr', name='uix_site_cidr')
        )
        print("  ✓ Recreated site_cidrs table (empty)\n")
    except Exception as e:
        print(f"  ⚠ Could not create table: {e}\n")

    # Step 2: Add site_cidr_id column back to site_ips (nullable)
    print("Step 2: Adding site_cidr_id column back to site_ips...")
    try:
        op.add_column('site_ips', sa.Column('site_cidr_id', sa.Integer(), nullable=True, comment='FK to site_cidrs (optional, for grouping)'))
        print("  ✓ Added site_cidr_id column (nullable)\n")
    except Exception as e:
        print(f"  ⚠ Could not add column: {e}\n")

    # Step 3: Add foreign key constraint
    print("Step 3: Adding foreign key constraint...")
    try:
        op.create_foreign_key('fk_site_ips_site_cidr_id', 'site_ips', 'site_cidrs', ['site_cidr_id'], ['id'])
        print("  ✓ Created foreign key constraint\n")
    except Exception as e:
        print(f"  ⚠ Could not create foreign key: {e}\n")

    # Step 4: Add index on site_cidr_id
    print("Step 4: Adding index on site_cidr_id...")
    try:
        op.create_index('ix_site_ips_site_cidr_id', 'site_ips', ['site_cidr_id'], unique=False)
```
|
||||
print(" ✓ Created index on site_cidr_id\n")
|
||||
except Exception as e:
|
||||
print(f" ⚠ Could not create index: {e}\n")
|
||||
|
||||
print("✓ Downgrade complete: CIDR table structure restored (but empty)")
|
||||
print(" All IPs remain standalone. You would need to manually recreate")
|
||||
print(" CIDR records and associate IPs with them.\n")
|
||||
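Each upgrade step above checks the current schema state before acting, so the migration can be re-run safely after a partial failure. A quick post-upgrade sanity check can confirm the result; a minimal sketch, assuming the SQLite URL used elsewhere in this changeset (adjust for your deployment):

# Hypothetical verification script for migration 009; the DB URL is an assumption.
from sqlalchemy import create_engine, inspect

engine = create_engine('sqlite:////app/data/sneakyscanner.db')
inspector = inspect(engine)

# site_cidrs should be gone, and site_ips should no longer carry site_cidr_id.
assert 'site_cidrs' not in inspector.get_table_names()
columns = [col['name'] for col in inspector.get_columns('site_ips')]
assert 'site_cidr_id' not in columns
print("Schema matches migration 009: sites are IP-only")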
53
app/migrations/versions/010_alert_rules_config_id.py
Normal file
@@ -0,0 +1,53 @@
"""Add config_id to alert_rules table

Revision ID: 010
Revises: 009
Create Date: 2025-11-19

This migration adds config_id foreign key to alert_rules table to replace
the config_file column, completing the migration from file-based to
database-based configurations.
"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic
revision = '010'
down_revision = '009'
branch_labels = None
depends_on = None


def upgrade():
    """
    Add config_id to alert_rules table and remove config_file.
    """
    with op.batch_alter_table('alert_rules', schema=None) as batch_op:
        # Add config_id column with foreign key
        batch_op.add_column(sa.Column('config_id', sa.Integer(), nullable=True, comment='FK to scan_configs table'))
        batch_op.create_index('ix_alert_rules_config_id', ['config_id'], unique=False)
        batch_op.create_foreign_key('fk_alert_rules_config_id', 'scan_configs', ['config_id'], ['id'])

        # Remove the old config_file column
        batch_op.drop_column('config_file')

    print("✓ Migration complete: AlertRule now uses config_id")
    print("  - Added config_id foreign key to alert_rules table")
    print("  - Removed deprecated config_file column")


def downgrade():
    """Remove config_id and restore config_file on alert_rules."""
    with op.batch_alter_table('alert_rules', schema=None) as batch_op:
        # Remove foreign key and config_id column
        batch_op.drop_constraint('fk_alert_rules_config_id', type_='foreignkey')
        batch_op.drop_index('ix_alert_rules_config_id')
        batch_op.drop_column('config_id')

        # Restore config_file column
        batch_op.add_column(sa.Column('config_file', sa.String(255), nullable=True, comment='Optional: specific config file this rule applies to'))

    print("✓ Downgrade complete: AlertRule config_id removed, config_file restored")
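The batch_alter_table context matters here: SQLite cannot add foreign keys or drop columns in place, so Alembic recreates the table behind the scenes. Note that the upgrade drops config_file without copying anything into config_id, so any alert rules that referenced a file are left with a NULL config_id. A hedged backfill sketch for after the upgrade; the fallback ID is a placeholder the operator must choose, and the DB URL is an assumption:

# Illustrative post-upgrade backfill, not part of the migration itself.
from sqlalchemy import create_engine, text

FALLBACK_CONFIG_ID = 1  # placeholder: substitute a real scan_configs.id

engine = create_engine('sqlite:////app/data/sneakyscanner.db')  # assumed default URL
with engine.begin() as conn:
    orphaned = conn.execute(text(
        "SELECT COUNT(*) FROM alert_rules WHERE config_id IS NULL"
    )).scalar()
    if orphaned:
        conn.execute(text(
            "UPDATE alert_rules SET config_id = :cfg WHERE config_id IS NULL"
        ), {"cfg": FALLBACK_CONFIG_ID})
        print(f"Pointed {orphaned} orphaned alert rule(s) at config {FALLBACK_CONFIG_ID}")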
86
app/migrations/versions/011_drop_config_file.py
Normal file
@@ -0,0 +1,86 @@
"""Drop deprecated config_file columns

Revision ID: 011
Revises: 010
Create Date: 2025-11-19

This migration removes the deprecated config_file columns from scans and schedules
tables. All functionality now uses config_id to reference database-stored configs.
"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic
revision = '011'
down_revision = '010'
branch_labels = None
depends_on = None


def upgrade():
    """
    Drop config_file columns from scans and schedules tables.

    Prerequisites:
    - All scans must have config_id set
    - All schedules must have config_id set
    - Code must be updated to no longer reference config_file
    """
    connection = op.get_bind()

    # Check for any records missing config_id
    result = connection.execute(sa.text(
        "SELECT COUNT(*) FROM scans WHERE config_id IS NULL"
    ))
    scans_without_config = result.scalar()

    result = connection.execute(sa.text(
        "SELECT COUNT(*) FROM schedules WHERE config_id IS NULL"
    ))
    schedules_without_config = result.scalar()

    if scans_without_config > 0:
        print(f"WARNING: {scans_without_config} scans have NULL config_id")
        print("  These scans will lose their config reference after migration")

    if schedules_without_config > 0:
        raise Exception(
            f"Cannot proceed: {schedules_without_config} schedules have NULL config_id. "
            "Please set config_id for all schedules before running this migration."
        )

    # Drop config_file from scans table
    with op.batch_alter_table('scans', schema=None) as batch_op:
        batch_op.drop_column('config_file')

    # Drop config_file from schedules table
    with op.batch_alter_table('schedules', schema=None) as batch_op:
        batch_op.drop_column('config_file')

    print("✓ Migration complete: Dropped config_file columns")
    print("  - Removed config_file from scans table")
    print("  - Removed config_file from schedules table")
    print("  - All references should now use config_id")


def downgrade():
    """Re-add config_file columns (data will be lost)."""
    # Add config_file back to scans
    with op.batch_alter_table('scans', schema=None) as batch_op:
        batch_op.add_column(
            sa.Column('config_file', sa.Text(), nullable=True,
                      comment='Path to YAML config used (deprecated)')
        )

    # Add config_file back to schedules
    with op.batch_alter_table('schedules', schema=None) as batch_op:
        batch_op.add_column(
            sa.Column('config_file', sa.Text(), nullable=True,
                      comment='Path to YAML config (deprecated)')
        )

    print("✓ Downgrade complete: Re-added config_file columns")
    print("  WARNING: config_file values are lost and will be NULL")
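Since the upgrade raises as soon as any schedule has a NULL config_id, it can be worth running the same counts the migration runs before invoking alembic upgrade. A read-only pre-flight sketch (DB URL assumed, as above):

# Hypothetical read-only pre-flight check mirroring migration 011's guards.
from sqlalchemy import create_engine, text

engine = create_engine('sqlite:////app/data/sneakyscanner.db')  # assumed default URL
with engine.connect() as conn:
    scans = conn.execute(text(
        "SELECT COUNT(*) FROM scans WHERE config_id IS NULL"
    )).scalar()
    schedules = conn.execute(text(
        "SELECT COUNT(*) FROM schedules WHERE config_id IS NULL"
    )).scalar()

print(f"scans without config_id: {scans} (will lose their config reference)")
print(f"schedules without config_id: {schedules} (must be 0 before upgrading)")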
58
app/migrations/versions/012_add_scan_progress.py
Normal file
@@ -0,0 +1,58 @@
"""Add scan progress tracking

Revision ID: 012
Revises: 011
Create Date: 2024-01-01 00:00:00.000000

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '012'
down_revision = '011'
branch_labels = None
depends_on = None


def upgrade():
    # Add progress tracking columns to scans table
    op.add_column('scans', sa.Column('current_phase', sa.String(50), nullable=True,
                  comment='Current scan phase: ping, tcp_scan, udp_scan, service_detection, http_analysis'))
    op.add_column('scans', sa.Column('total_ips', sa.Integer(), nullable=True,
                  comment='Total number of IPs to scan'))
    op.add_column('scans', sa.Column('completed_ips', sa.Integer(), nullable=True, default=0,
                  comment='Number of IPs completed in current phase'))

    # Create scan_progress table for per-IP progress tracking
    op.create_table(
        'scan_progress',
        sa.Column('id', sa.Integer(), primary_key=True, autoincrement=True),
        sa.Column('scan_id', sa.Integer(), sa.ForeignKey('scans.id'), nullable=False, index=True),
        sa.Column('ip_address', sa.String(45), nullable=False, comment='IP address being scanned'),
        sa.Column('site_name', sa.String(255), nullable=True, comment='Site name this IP belongs to'),
        sa.Column('phase', sa.String(50), nullable=False,
                  comment='Phase: ping, tcp_scan, udp_scan, service_detection, http_analysis'),
        sa.Column('status', sa.String(20), nullable=False, default='pending',
                  comment='pending, in_progress, completed, failed'),
        sa.Column('ping_result', sa.Boolean(), nullable=True, comment='Ping response result'),
        sa.Column('tcp_ports', sa.Text(), nullable=True, comment='JSON array of discovered TCP ports'),
        sa.Column('udp_ports', sa.Text(), nullable=True, comment='JSON array of discovered UDP ports'),
        sa.Column('services', sa.Text(), nullable=True, comment='JSON array of detected services'),
        sa.Column('created_at', sa.DateTime(), nullable=False, server_default=sa.func.now(),
                  comment='Entry creation time'),
        sa.Column('updated_at', sa.DateTime(), nullable=False, server_default=sa.func.now(),
                  onupdate=sa.func.now(), comment='Last update time'),
        sa.UniqueConstraint('scan_id', 'ip_address', name='uix_scan_progress_ip')
    )


def downgrade():
    # Drop scan_progress table
    op.drop_table('scan_progress')

    # Remove progress tracking columns from scans table
    op.drop_column('scans', 'completed_ips')
    op.drop_column('scans', 'total_ips')
    op.drop_column('scans', 'current_phase')
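With ports and services stored as JSON text per row, reading progress back is a decode step rather than a join. A short read-side sketch against this schema; table and column names come from the migration, while the DB URL and scan ID are assumptions:

# Illustrative progress readout for the scan_progress schema above.
import json
from sqlalchemy import create_engine, text

engine = create_engine('sqlite:////app/data/sneakyscanner.db')  # assumed default URL
with engine.connect() as conn:
    scan = conn.execute(text(
        "SELECT total_ips, completed_ips, current_phase FROM scans WHERE id = :sid"
    ), {"sid": 1}).first()
    if scan and scan.total_ips:
        pct = 100.0 * (scan.completed_ips or 0) / scan.total_ips
        print(f"{scan.current_phase}: {pct:.0f}% complete")

    for row in conn.execute(text(
        "SELECT ip_address, status, tcp_ports FROM scan_progress WHERE scan_id = :sid"
    ), {"sid": 1}):
        ports = json.loads(row.tcp_ports) if row.tcp_ports else []
        print(f"{row.ip_address}: {row.status}, {len(ports)} TCP ports")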
@@ -12,7 +12,7 @@ alembic==1.13.0

# Authentication & Security
Flask-Login==0.6.3
bcrypt==4.1.2
cryptography==41.0.7
cryptography>=46.0.0

# API & Serialization
Flask-CORS==4.0.0

@@ -26,9 +26,12 @@ croniter==2.0.1

# Email Support (Phase 4)
Flask-Mail==0.9.1

# Webhook Support (Phase 5)
requests==2.31.0

# Configuration Management
python-dotenv==1.0.0

# Development & Testing
pytest==7.4.3
pytest-flask==1.3.0
@@ -1,5 +1,5 @@
PyYAML==6.0.1
python-libnmap==0.7.3
sslyze==6.0.0
sslyze==6.2.0
playwright==1.40.0
Jinja2==3.1.2
@@ -78,7 +78,7 @@ class HTMLReportGenerator:
            'title': self.report_data.get('title', 'SneakyScanner Report'),
            'scan_time': self.report_data.get('scan_time'),
            'scan_duration': self.report_data.get('scan_duration'),
            'config_file': self.report_data.get('config_file'),
            'config_id': self.report_data.get('config_id'),
            'sites': self.report_data.get('sites', []),
            'summary_stats': summary_stats,
            'drift_alerts': drift_alerts,
@@ -6,14 +6,17 @@ SneakyScanner - Masscan-based network scanner with YAML configuration
import argparse
import json
import logging
import os
import signal
import subprocess
import sys
import tempfile
import threading
import time
import zipfile
from datetime import datetime
from pathlib import Path
from typing import Dict, List, Any
from typing import Dict, List, Any, Callable, Optional
import xml.etree.ElementTree as ET

import yaml
@@ -22,24 +25,93 @@ from libnmap.parser import NmapParser

from src.screenshot_capture import ScreenshotCapture
from src.report_generator import HTMLReportGenerator
from web.config import NMAP_HOST_TIMEOUT

# Force unbuffered output for Docker
sys.stdout.reconfigure(line_buffering=True)
sys.stderr.reconfigure(line_buffering=True)


class SneakyScanner:
    """Wrapper for masscan to perform network scans based on YAML config"""
class ScanCancelledError(Exception):
    """Raised when a scan is cancelled by the user."""
    pass

    def __init__(self, config_path: str, output_dir: str = "/app/output"):
        self.config_path = Path(config_path)

class SneakyScanner:
    """Wrapper for masscan to perform network scans based on YAML config or database config"""

    def __init__(self, config_path: str = None, config_id: int = None, config_dict: Dict = None, output_dir: str = "/app/output"):
        """
        Initialize scanner with configuration.

        Args:
            config_path: Path to YAML config file (legacy)
            config_id: Database config ID (preferred)
            config_dict: Config dictionary (for direct use)
            output_dir: Output directory for scan results

        Note: Provide exactly one of config_path, config_id, or config_dict
        """
        if sum([config_path is not None, config_id is not None, config_dict is not None]) != 1:
            raise ValueError("Must provide exactly one of: config_path, config_id, or config_dict")

        self.config_path = Path(config_path) if config_path else None
        self.config_id = config_id
        self.output_dir = Path(output_dir)
        self.output_dir.mkdir(parents=True, exist_ok=True)
        self.config = self._load_config()

        if config_dict:
            self.config = config_dict
            # Process sites: resolve references and expand CIDRs
            if 'sites' in self.config:
                self.config['sites'] = self._resolve_sites(self.config['sites'])
        else:
            self.config = self._load_config()

        self.screenshot_capture = None

        # Cancellation support
        self._cancelled = False
        self._cancel_lock = threading.Lock()
        self._active_process = None
        self._process_lock = threading.Lock()

    def cancel(self):
        """
        Cancel the running scan.

        Terminates any active subprocess and sets cancellation flag.
        """
        with self._cancel_lock:
            self._cancelled = True

        with self._process_lock:
            if self._active_process and self._active_process.poll() is None:
                try:
                    # Terminate the process group
                    os.killpg(os.getpgid(self._active_process.pid), signal.SIGTERM)
                except (ProcessLookupError, OSError):
                    pass

    def is_cancelled(self) -> bool:
        """Check if scan has been cancelled."""
        with self._cancel_lock:
            return self._cancelled

    def _load_config(self) -> Dict[str, Any]:
        """Load and validate YAML configuration"""
        """
        Load and validate configuration from file or database.

        Supports three formats:
        1. Legacy: Sites with explicit IP lists
        2. Site references: Sites referencing database-stored sites
        3. Inline CIDRs: Sites with CIDR ranges
        """
        # Load from database if config_id provided
        if self.config_id:
            return self._load_config_from_database(self.config_id)

        # Load from YAML file
        if not self.config_path.exists():
            raise FileNotFoundError(f"Config file not found: {self.config_path}")
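Taken together, the constructor contract and the cancel()/is_cancelled() pair support a cancel button: the scan runs on a worker thread while another thread flips the flag. A minimal sketch of that flow, assuming a valid database config ID (the ID and the sleep timing are illustrative):

# Illustrative driver showing one-of-three construction plus cooperative cancel.
import threading
import time

from src.scanner import SneakyScanner, ScanCancelledError

scanner = SneakyScanner(config_id=1)  # exactly one of config_path/config_id/config_dict

def run():
    try:
        scanner.scan()
    except ScanCancelledError:
        print("scan stopped early")

worker = threading.Thread(target=run)
worker.start()
time.sleep(5)          # let a phase start (timing is arbitrary)
scanner.cancel()       # SIGTERMs the active masscan/nmap process group
worker.join()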
@@ -51,8 +123,256 @@ class SneakyScanner:
        if not config.get('sites'):
            raise ValueError("Config must include 'sites' field")

        # Process sites: resolve references and expand CIDRs
        config['sites'] = self._resolve_sites(config['sites'])

        return config

    def _load_config_from_database(self, config_id: int) -> Dict[str, Any]:
        """
        Load configuration from database by ID.

        Args:
            config_id: Database config ID

        Returns:
            Config dictionary with expanded sites

        Raises:
            ValueError: If config not found or invalid
        """
        try:
            # Import here to avoid circular dependencies and allow scanner to work standalone
            import os
            import sys

            # Add parent directory to path for imports
            sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

            from sqlalchemy import create_engine
            from sqlalchemy.orm import sessionmaker
            from web.models import ScanConfig

            # Create database session
            db_url = os.environ.get('DATABASE_URL', 'sqlite:////app/data/sneakyscanner.db')
            engine = create_engine(db_url)
            Session = sessionmaker(bind=engine)
            session = Session()

            try:
                # Load config from database
                db_config = session.query(ScanConfig).filter_by(id=config_id).first()

                if not db_config:
                    raise ValueError(f"Config with ID {config_id} not found in database")

                # Build config dict with site references
                config = {
                    'title': db_config.title,
                    'sites': []
                }

                # Add each site as a site_ref
                for assoc in db_config.site_associations:
                    site = assoc.site
                    config['sites'].append({
                        'site_ref': site.name
                    })

                # Process sites: resolve references and expand CIDRs
                config['sites'] = self._resolve_sites(config['sites'])

                return config

            finally:
                session.close()

        except ImportError as e:
            raise ValueError(f"Failed to load config from database (import error): {str(e)}")
        except Exception as e:
            raise ValueError(f"Failed to load config from database: {str(e)}")

    def _resolve_sites(self, sites: List[Dict]) -> List[Dict]:
        """
        Resolve site references and expand CIDRs to IP lists.

        Converts all site formats into the legacy format (with explicit IPs)
        for compatibility with the existing scan logic.

        Args:
            sites: List of site definitions from config

        Returns:
            List of sites with expanded IP lists
        """
        import ipaddress

        resolved_sites = []

        for site_def in sites:
            # Handle site references
            if 'site_ref' in site_def:
                site_ref = site_def['site_ref']
                # Load site from database
                site_data = self._load_site_from_database(site_ref)
                if site_data:
                    resolved_sites.append(site_data)
                else:
                    print(f"WARNING: Site reference '{site_ref}' not found in database", file=sys.stderr)
                continue

            # Handle inline CIDR definitions
            if 'cidrs' in site_def:
                site_name = site_def.get('name', 'Unknown Site')
                expanded_ips = []

                for cidr_def in site_def['cidrs']:
                    cidr = cidr_def['cidr']
                    expected_ping = cidr_def.get('expected_ping', False)
                    expected_tcp_ports = cidr_def.get('expected_tcp_ports', [])
                    expected_udp_ports = cidr_def.get('expected_udp_ports', [])

                    # Check if there are IP-level overrides (from database sites)
                    ip_overrides = cidr_def.get('ip_overrides', [])
                    override_map = {
                        override['ip_address']: override
                        for override in ip_overrides
                    }

                    # Expand CIDR to IP list
                    try:
                        network = ipaddress.ip_network(cidr, strict=False)
                        ip_list = [str(ip) for ip in network.hosts()]

                        # If network has only 1 address (like /32), hosts() returns empty
                        if not ip_list:
                            ip_list = [str(network.network_address)]

                        # Create IP config for each IP in the CIDR
                        for ip_address in ip_list:
                            # Check if this IP has an override
                            if ip_address in override_map:
                                override = override_map[ip_address]
                                ip_config = {
                                    'address': ip_address,
                                    'expected': {
                                        'ping': override.get('expected_ping', expected_ping),
                                        'tcp_ports': override.get('expected_tcp_ports', expected_tcp_ports),
                                        'udp_ports': override.get('expected_udp_ports', expected_udp_ports)
                                    }
                                }
                            else:
                                # Use CIDR-level defaults
                                ip_config = {
                                    'address': ip_address,
                                    'expected': {
                                        'ping': expected_ping,
                                        'tcp_ports': expected_tcp_ports,
                                        'udp_ports': expected_udp_ports
                                    }
                                }

                            expanded_ips.append(ip_config)

                    except ValueError as e:
                        print(f"WARNING: Invalid CIDR '{cidr}': {e}", file=sys.stderr)
                        continue

                # Add expanded site
                resolved_sites.append({
                    'name': site_name,
                    'ips': expanded_ips
                })
                continue

            # Legacy format: already has 'ips' list
            if 'ips' in site_def:
                resolved_sites.append(site_def)
                continue

            print(f"WARNING: Site definition missing required fields: {site_def}", file=sys.stderr)

        return resolved_sites
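For reference, the three accepted site shapes can be sketched in one config dict; the names, addresses, and CIDRs below are illustrative, and a site_ref value must match a site stored in the database:

# Illustrative config covering all three site formats _resolve_sites accepts.
example_config = {
    'title': 'Example Scan',
    'sites': [
        # 1. Legacy: explicit IP list
        {'name': 'Static Hosts', 'ips': [
            {'address': '10.0.0.5', 'expected': {'ping': True, 'tcp_ports': [22], 'udp_ports': []}},
        ]},
        # 2. Site reference: resolved from the database by name
        {'site_ref': 'Production DMZ'},
        # 3. Inline CIDR: expanded to individual IPs at load time
        {'name': 'Lab Network', 'cidrs': [
            {'cidr': '192.168.50.0/30', 'expected_ping': False, 'expected_tcp_ports': [80, 443]},
        ]},
    ],
}

# scanner = SneakyScanner(config_dict=example_config)  # hypothetical usage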
    def _load_site_from_database(self, site_name: str) -> Dict[str, Any]:
        """
        Load a site definition from the database.

        IPs are pre-expanded in the database, so we just load them directly.

        Args:
            site_name: Name of the site to load

        Returns:
            Site definition dict with IPs, or None if not found
        """
        try:
            # Import database modules
            import os
            import sys

            # Add parent directory to path if needed
            parent_dir = str(Path(__file__).parent.parent)
            if parent_dir not in sys.path:
                sys.path.insert(0, parent_dir)

            from sqlalchemy import create_engine
            from sqlalchemy.orm import sessionmaker, joinedload
            from web.models import Site

            # Get database URL from environment
            database_url = os.environ.get('DATABASE_URL', 'sqlite:///./sneakyscanner.db')

            # Create engine and session
            engine = create_engine(database_url)
            Session = sessionmaker(bind=engine)
            session = Session()

            # Query site with all IPs (CIDRs are already expanded)
            site = (
                session.query(Site)
                .options(joinedload(Site.ips))
                .filter(Site.name == site_name)
                .first()
            )

            if not site:
                session.close()
                return None

            # Load all IPs directly from database (already expanded)
            expanded_ips = []

            for ip_obj in site.ips:
                # Get settings from IP (no need to merge with CIDR defaults)
                expected_ping = ip_obj.expected_ping if ip_obj.expected_ping is not None else False
                expected_tcp_ports = json.loads(ip_obj.expected_tcp_ports) if ip_obj.expected_tcp_ports else []
                expected_udp_ports = json.loads(ip_obj.expected_udp_ports) if ip_obj.expected_udp_ports else []

                ip_config = {
                    'address': ip_obj.ip_address,
                    'expected': {
                        'ping': expected_ping,
                        'tcp_ports': expected_tcp_ports,
                        'udp_ports': expected_udp_ports
                    }
                }

                expanded_ips.append(ip_config)

            session.close()

            return {
                'name': site.name,
                'ips': expanded_ips
            }

        except Exception as e:
            print(f"ERROR: Failed to load site '{site_name}' from database: {e}", file=sys.stderr)
            import traceback
            traceback.print_exc()
            return None

    def _run_masscan(self, targets: List[str], ports: str, protocol: str) -> List[Dict]:
        """
        Run masscan and return parsed results
@@ -98,11 +418,31 @@ class SneakyScanner:
            raise ValueError(f"Invalid protocol: {protocol}")

        print(f"Running: {' '.join(cmd)}", flush=True)
        result = subprocess.run(cmd, capture_output=True, text=True)

        # Use Popen for cancellation support
        with self._process_lock:
            self._active_process = subprocess.Popen(
                cmd,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
                text=True,
                start_new_session=True
            )

        stdout, stderr = self._active_process.communicate()
        returncode = self._active_process.returncode

        with self._process_lock:
            self._active_process = None

        # Check if cancelled
        if self.is_cancelled():
            return []

        print(f"Masscan {protocol.upper()} scan completed", flush=True)

        if result.returncode != 0:
            print(f"Masscan stderr: {result.stderr}", file=sys.stderr)
        if returncode != 0:
            print(f"Masscan stderr: {stderr}", file=sys.stderr)

        # Parse masscan JSON output
        results = []
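Launching masscan with start_new_session=True puts it in its own process group, which is what lets cancel() take down masscan and any children it forks with a single signal. A minimal standalone sketch of the same pattern; the sleep command is just a stand-in for a long-running scan:

# Minimal illustration of the process-group cancellation pattern used above (POSIX).
import os
import signal
import subprocess

proc = subprocess.Popen(['sleep', '60'], start_new_session=True)  # own process group
try:
    proc.wait(timeout=1)
except subprocess.TimeoutExpired:
    # SIGTERM the whole group, mirroring SneakyScanner.cancel()
    os.killpg(os.getpgid(proc.pid), signal.SIGTERM)
    proc.wait()
print("terminated:", proc.returncode)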
@@ -150,11 +490,31 @@ class SneakyScanner:
        ]

        print(f"Running: {' '.join(cmd)}", flush=True)
        result = subprocess.run(cmd, capture_output=True, text=True)

        # Use Popen for cancellation support
        with self._process_lock:
            self._active_process = subprocess.Popen(
                cmd,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
                text=True,
                start_new_session=True
            )

        stdout, stderr = self._active_process.communicate()
        returncode = self._active_process.returncode

        with self._process_lock:
            self._active_process = None

        # Check if cancelled
        if self.is_cancelled():
            return {}

        print(f"Masscan PING scan completed", flush=True)

        if result.returncode != 0:
            print(f"Masscan stderr: {result.stderr}", file=sys.stderr, flush=True)
        if returncode != 0:
            print(f"Masscan stderr: {stderr}", file=sys.stderr, flush=True)

        # Parse results
        responding_ips = set()
@@ -192,6 +552,10 @@ class SneakyScanner:
        all_services = {}

        for ip, ports in ip_ports.items():
            # Check if cancelled before each host
            if self.is_cancelled():
                break

            if not ports:
                all_services[ip] = []
                continue
@@ -213,14 +577,33 @@ class SneakyScanner:
                '--version-intensity', '5',  # Balanced speed/accuracy
                '-p', port_list,
                '-oX', xml_output,  # XML output
                '--host-timeout', '5m',  # Timeout per host
                '--host-timeout', NMAP_HOST_TIMEOUT,  # Timeout per host
                ip
            ]

            result = subprocess.run(cmd, capture_output=True, text=True, timeout=600)
            # Use Popen for cancellation support
            with self._process_lock:
                self._active_process = subprocess.Popen(
                    cmd,
                    stdout=subprocess.PIPE,
                    stderr=subprocess.PIPE,
                    text=True,
                    start_new_session=True
                )

            if result.returncode != 0:
                print(f"  Nmap warning for {ip}: {result.stderr}", file=sys.stderr, flush=True)
            stdout, stderr = self._active_process.communicate(timeout=600)
            returncode = self._active_process.returncode

            with self._process_lock:
                self._active_process = None

            # Check if cancelled
            if self.is_cancelled():
                Path(xml_output).unlink(missing_ok=True)
                break

            if returncode != 0:
                print(f"  Nmap warning for {ip}: {stderr}", file=sys.stderr, flush=True)

            # Parse XML output
            services = self._parse_nmap_xml(xml_output)
@@ -293,29 +676,57 @@ class SneakyScanner:

        return services

    def _is_likely_web_service(self, service: Dict) -> bool:
    def _is_likely_web_service(self, service: Dict, ip: str = None) -> bool:
        """
        Check if a service is likely HTTP/HTTPS based on nmap detection or common web ports
        Check if a service is a web server by actually making an HTTP request

        Args:
            service: Service dictionary from nmap results
            ip: IP address to test (required for HTTP probe)

        Returns:
            True if service appears to be web-related
            True if service responds to HTTP/HTTPS requests
        """
        # Check service name
        import requests
        import urllib3
        urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

        # Quick check for known web service names first
        web_services = ['http', 'https', 'ssl', 'http-proxy', 'https-alt',
                        'http-alt', 'ssl/http', 'ssl/https']
        service_name = service.get('service', '').lower()

        if service_name in web_services:
            return True

        # Check common non-standard web ports
        web_ports = [80, 443, 8000, 8006, 8008, 8080, 8081, 8443, 8888, 9443]
        # If no IP provided, can't do HTTP probe
        port = service.get('port')
        if not ip or not port:
            # check just the service name if no IP - honestly shouldn't get here, but just in case...
            if service_name in web_services:
                return True
            return False

        return port in web_ports
        # Actually try to connect - this is the definitive test
        # Try HTTPS first, then HTTP
        for protocol in ['https', 'http']:
            url = f"{protocol}://{ip}:{port}/"
            try:
                response = requests.get(
                    url,
                    timeout=3,
                    verify=False,
                    allow_redirects=False
                )
                # Any status code means it's a web server
                # (including 404, 500, etc. - still a web server)
                return True
            except requests.exceptions.SSLError:
                # SSL error on HTTPS, try HTTP next
                continue
            except (requests.exceptions.ConnectionError,
                    requests.exceptions.Timeout,
                    requests.exceptions.RequestException):
                continue

        return False
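Probing the port directly trades a little scan time for far fewer false positives than port-number heuristics. If transfer size ever matters, the same decision logic works with a lighter request; a hedged variant (not what the scanner ships, and note that some servers reject HEAD, so this is only a sketch):

# Hypothetical lighter-weight probe: HEAD instead of GET, same decision logic.
import requests
import urllib3

urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

def probe_web(ip: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if ip:port answers HTTP(S); any status code counts."""
    for protocol in ('https', 'http'):
        try:
            requests.head(f"{protocol}://{ip}:{port}/", timeout=timeout,
                          verify=False, allow_redirects=False)
            return True
        except requests.exceptions.RequestException:
            continue
    return False

# print(probe_web('192.0.2.10', 8443))  # example host is illustrative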
    def _detect_http_https(self, ip: str, port: int, timeout: int = 5) -> str:
        """
@@ -503,7 +914,7 @@ class SneakyScanner:
        ip_results = {}

        for service in services:
            if not self._is_likely_web_service(service):
            if not self._is_likely_web_service(service, ip):
                continue

            port = service['port']
@@ -549,15 +960,25 @@ class SneakyScanner:

        return all_results

    def scan(self) -> Dict[str, Any]:
    def scan(self, progress_callback: Optional[Callable] = None) -> Dict[str, Any]:
        """
        Perform complete scan based on configuration

        Args:
            progress_callback: Optional callback function for progress updates.
                Called with (phase, ip, data) where:
                - phase: 'init', 'ping', 'tcp_scan', 'udp_scan', 'service_detection', 'http_analysis'
                - ip: IP address being processed (or None for phase start)
                - data: Dict with progress data (results, counts, etc.)

        Returns:
            Dictionary containing scan results
        """
        print(f"Starting scan: {self.config['title']}", flush=True)
        print(f"Config: {self.config_path}", flush=True)
        if self.config_id:
            print(f"Config ID: {self.config_id}", flush=True)
        elif self.config_path:
            print(f"Config: {self.config_path}", flush=True)

        # Record start time
        start_time = time.time()
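A callback satisfying the (phase, ip, data) contract above can be as small as a function that logs phase transitions; persisting to the scan_progress table is up to the caller. A minimal sketch:

# Minimal progress_callback sketch matching the documented (phase, ip, data) contract.
def log_progress(phase, ip, data):
    status = data.get('status')
    if phase == 'init':
        print(f"starting: {data['total_ips']} IPs queued")
    elif status in ('starting', 'skipped'):
        print(f"{phase}: {status}")
    elif status == 'completed':
        print(f"{phase}: done ({len(data.get('results', {}))} result entries)")

# Hypothetical usage with a database-backed config:
# results = SneakyScanner(config_id=1).scan(progress_callback=log_progress)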
@@ -586,17 +1007,61 @@ class SneakyScanner:
        all_ips = sorted(list(all_ips))
        print(f"Total IPs to scan: {len(all_ips)}", flush=True)

        # Report initialization with total IP count
        if progress_callback:
            progress_callback('init', None, {
                'total_ips': len(all_ips),
                'ip_to_site': ip_to_site
            })

        # Perform ping scan
        print(f"\n[1/5] Performing ping scan on {len(all_ips)} IPs...", flush=True)
        if progress_callback:
            progress_callback('ping', None, {'status': 'starting'})
        ping_results = self._run_ping_scan(all_ips)

        # Check for cancellation
        if self.is_cancelled():
            print("\nScan cancelled by user", flush=True)
            raise ScanCancelledError("Scan cancelled by user")

        # Report ping results
        if progress_callback:
            progress_callback('ping', None, {
                'status': 'completed',
                'results': ping_results
            })

        # Perform TCP scan (all ports)
        print(f"\n[2/5] Performing TCP scan on {len(all_ips)} IPs (ports 0-65535)...", flush=True)
        if progress_callback:
            progress_callback('tcp_scan', None, {'status': 'starting'})
        tcp_results = self._run_masscan(all_ips, '0-65535', 'tcp')

        # Perform UDP scan (all ports)
        print(f"\n[3/5] Performing UDP scan on {len(all_ips)} IPs (ports 0-65535)...", flush=True)
        udp_results = self._run_masscan(all_ips, '0-65535', 'udp')
        # Check for cancellation
        if self.is_cancelled():
            print("\nScan cancelled by user", flush=True)
            raise ScanCancelledError("Scan cancelled by user")

        # Perform UDP scan (if enabled)
        udp_enabled = os.environ.get('UDP_SCAN_ENABLED', 'false').lower() == 'true'
        udp_ports = os.environ.get('UDP_PORTS', '53,67,68,69,123,161,500,514,1900')

        if udp_enabled:
            print(f"\n[3/5] Performing UDP scan on {len(all_ips)} IPs (ports {udp_ports})...", flush=True)
            if progress_callback:
                progress_callback('udp_scan', None, {'status': 'starting'})
            udp_results = self._run_masscan(all_ips, udp_ports, 'udp')

            # Check for cancellation
            if self.is_cancelled():
                print("\nScan cancelled by user", flush=True)
                raise ScanCancelledError("Scan cancelled by user")
        else:
            print(f"\n[3/5] Skipping UDP scan (disabled)...", flush=True)
            if progress_callback:
                progress_callback('udp_scan', None, {'status': 'skipped'})
            udp_results = []

        # Organize results by IP
        results_by_ip = {}
@@ -631,20 +1096,56 @@ class SneakyScanner:
            results_by_ip[ip]['actual']['tcp_ports'].sort()
            results_by_ip[ip]['actual']['udp_ports'].sort()

        # Report TCP/UDP scan results with discovered ports per IP
        if progress_callback:
            tcp_udp_results = {}
            for ip in all_ips:
                tcp_udp_results[ip] = {
                    'tcp_ports': results_by_ip[ip]['actual']['tcp_ports'],
                    'udp_ports': results_by_ip[ip]['actual']['udp_ports']
                }
            progress_callback('tcp_scan', None, {
                'status': 'completed',
                'results': tcp_udp_results
            })

        # Perform service detection on TCP ports
        print(f"\n[4/5] Performing service detection on discovered TCP ports...", flush=True)
        if progress_callback:
            progress_callback('service_detection', None, {'status': 'starting'})
        ip_ports = {ip: results_by_ip[ip]['actual']['tcp_ports'] for ip in all_ips}
        service_results = self._run_nmap_service_detection(ip_ports)

        # Check for cancellation
        if self.is_cancelled():
            print("\nScan cancelled by user", flush=True)
            raise ScanCancelledError("Scan cancelled by user")

        # Add service information to results
        for ip, services in service_results.items():
            if ip in results_by_ip:
                results_by_ip[ip]['actual']['services'] = services

        # Report service detection results
        if progress_callback:
            progress_callback('service_detection', None, {
                'status': 'completed',
                'results': service_results
            })

        # Perform HTTP/HTTPS analysis on web services
        print(f"\n[5/5] Analyzing HTTP/HTTPS services and SSL/TLS configuration...", flush=True)
        if progress_callback:
            progress_callback('http_analysis', None, {'status': 'starting'})
        http_results = self._run_http_analysis(service_results)

        # Report HTTP analysis completion
        if progress_callback:
            progress_callback('http_analysis', None, {
                'status': 'completed',
                'results': http_results
            })

        # Merge HTTP analysis into service results
        for ip, port_results in http_results.items():
            if ip in results_by_ip:
@@ -662,7 +1163,7 @@ class SneakyScanner:
            'title': self.config['title'],
            'scan_time': datetime.utcnow().isoformat() + 'Z',
            'scan_duration': scan_duration,
            'config_file': str(self.config_path),
            'config_id': self.config_id,
            'sites': []
        }

@@ -768,6 +1269,8 @@ class SneakyScanner:
                        # Preserve directory structure in ZIP
                        arcname = f"{screenshot_dir.name}/{screenshot_file.name}"
                        zipf.write(screenshot_file, arcname)
                # Track screenshot directory for database storage
                output_paths['screenshots'] = screenshot_dir

            output_paths['zip'] = zip_path
            print(f"ZIP archive saved to: {zip_path}", flush=True)
File diff suppressed because it is too large
@@ -490,8 +490,8 @@
        <div class="header-meta">
            <span>📅 <strong>Scan Time:</strong> {{ scan_time | format_date }}</span>
            <span>⏱️ <strong>Duration:</strong> {{ scan_duration | format_duration }}</span>
            {% if config_file %}
            <span>📄 <strong>Config:</strong> {{ config_file }}</span>
            {% if config_id %}
            <span>📄 <strong>Config ID:</strong> {{ config_id }}</span>
            {% endif %}
        </div>
    </div>
@@ -13,7 +13,7 @@ from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from web.app import create_app
from web.models import Base, Scan
from web.models import Base, Scan, ScanConfig
from web.utils.settings import PasswordManager, SettingsManager


@@ -53,7 +53,7 @@ def sample_scan_report():
        'title': 'Test Scan',
        'scan_time': '2025-11-14T10:30:00Z',
        'scan_duration': 125.5,
        'config_file': '/app/configs/test.yaml',
        'config_id': 1,
        'sites': [
            {
                'name': 'Test Site',
@@ -199,6 +199,53 @@ def sample_invalid_config_file(tmp_path):
    return str(config_file)


@pytest.fixture
def sample_db_config(db):
    """
    Create a sample database config for testing.

    Args:
        db: Database session fixture

    Returns:
        ScanConfig model instance with ID
    """
    import json

    config_data = {
        'title': 'Test Scan',
        'sites': [
            {
                'name': 'Test Site',
                'ips': [
                    {
                        'address': '192.168.1.10',
                        'expected': {
                            'ping': True,
                            'tcp_ports': [22, 80, 443],
                            'udp_ports': [53],
                            'services': ['ssh', 'http', 'https']
                        }
                    }
                ]
            }
        ]
    }

    scan_config = ScanConfig(
        title='Test Scan',
        config_data=json.dumps(config_data),
        created_at=datetime.utcnow(),
        updated_at=datetime.utcnow()
    )

    db.add(scan_config)
    db.commit()
    db.refresh(scan_config)

    return scan_config


@pytest.fixture(scope='function')
def app():
    """
@@ -269,7 +316,7 @@ def sample_scan(db):
    scan = Scan(
        timestamp=datetime.utcnow(),
        status='completed',
        config_file='/app/configs/test.yaml',
        config_id=1,
        title='Test Scan',
        duration=125.5,
        triggered_by='test',
@@ -23,12 +23,12 @@ class TestBackgroundJobs:
        assert app.scheduler.scheduler is not None
        assert app.scheduler.scheduler.running

    def test_queue_scan_job(self, app, db, sample_config_file):
    def test_queue_scan_job(self, app, db, sample_db_config):
        """Test queuing a scan for background execution."""
        # Create a scan via service
        scan_service = ScanService(db)
        scan_id = scan_service.trigger_scan(
            config_file=sample_config_file,
            config_id=sample_db_config.id,
            triggered_by='test',
            scheduler=app.scheduler
        )
@@ -43,12 +43,12 @@ class TestBackgroundJobs:
        assert job is not None
        assert job.id == f'scan_{scan_id}'

    def test_trigger_scan_without_scheduler(self, db, sample_config_file):
    def test_trigger_scan_without_scheduler(self, db, sample_db_config):
        """Test triggering scan without scheduler logs warning."""
        # Create scan without scheduler
        scan_service = ScanService(db)
        scan_id = scan_service.trigger_scan(
            config_file=sample_config_file,
            config_id=sample_db_config.id,
            triggered_by='test',
            scheduler=None  # No scheduler
        )
@@ -58,13 +58,13 @@ class TestBackgroundJobs:
        assert scan is not None
        assert scan.status == 'running'

    def test_scheduler_service_queue_scan(self, app, db, sample_config_file):
    def test_scheduler_service_queue_scan(self, app, db, sample_db_config):
        """Test SchedulerService.queue_scan directly."""
        # Create scan record first
        scan = Scan(
            timestamp=datetime.utcnow(),
            status='running',
            config_file=sample_config_file,
            config_id=sample_db_config.id,
            title='Test Scan',
            triggered_by='test'
        )
@@ -72,27 +72,27 @@ class TestBackgroundJobs:
        db.commit()

        # Queue the scan
        job_id = app.scheduler.queue_scan(scan.id, sample_config_file)
        job_id = app.scheduler.queue_scan(scan.id, sample_db_config)

        # Verify job was queued
        assert job_id == f'scan_{scan.id}'
        job = app.scheduler.scheduler.get_job(job_id)
        assert job is not None

    def test_scheduler_list_jobs(self, app, db, sample_config_file):
    def test_scheduler_list_jobs(self, app, db, sample_db_config):
        """Test listing scheduled jobs."""
        # Queue a few scans
        for i in range(3):
            scan = Scan(
                timestamp=datetime.utcnow(),
                status='running',
                config_file=sample_config_file,
                config_id=sample_db_config.id,
                title=f'Test Scan {i}',
                triggered_by='test'
            )
            db.add(scan)
            db.commit()
            app.scheduler.queue_scan(scan.id, sample_config_file)
            app.scheduler.queue_scan(scan.id, sample_db_config)

        # List jobs
        jobs = app.scheduler.list_jobs()
@@ -106,20 +106,20 @@ class TestBackgroundJobs:
            assert 'name' in job
            assert 'trigger' in job

    def test_scheduler_get_job_status(self, app, db, sample_config_file):
    def test_scheduler_get_job_status(self, app, db, sample_db_config):
        """Test getting status of a specific job."""
        # Create and queue a scan
        scan = Scan(
            timestamp=datetime.utcnow(),
            status='running',
            config_file=sample_config_file,
            config_id=sample_db_config.id,
            title='Test Scan',
            triggered_by='test'
        )
        db.add(scan)
        db.commit()

        job_id = app.scheduler.queue_scan(scan.id, sample_config_file)
        job_id = app.scheduler.queue_scan(scan.id, sample_db_config)

        # Get job status
        status = app.scheduler.get_job_status(job_id)
@@ -133,13 +133,13 @@ class TestBackgroundJobs:
        status = app.scheduler.get_job_status('nonexistent_job_id')
        assert status is None

    def test_scan_timing_fields(self, db, sample_config_file):
    def test_scan_timing_fields(self, db, sample_db_config):
        """Test that scan timing fields are properly set."""
        # Create scan with started_at
        scan = Scan(
            timestamp=datetime.utcnow(),
            status='running',
            config_file=sample_config_file,
            config_id=sample_db_config.id,
            title='Test Scan',
            triggered_by='test',
            started_at=datetime.utcnow()
@@ -161,13 +161,13 @@ class TestBackgroundJobs:
        assert scan.completed_at is not None
        assert (scan.completed_at - scan.started_at).total_seconds() >= 0

    def test_scan_error_handling(self, db, sample_config_file):
    def test_scan_error_handling(self, db, sample_db_config):
        """Test that error messages are stored correctly."""
        # Create failed scan
        scan = Scan(
            timestamp=datetime.utcnow(),
            status='failed',
            config_file=sample_config_file,
            config_id=sample_db_config.id,
            title='Failed Scan',
            triggered_by='test',
            started_at=datetime.utcnow(),
@@ -188,7 +188,7 @@ class TestBackgroundJobs:
        assert status['error_message'] == 'Test error message'

    @pytest.mark.skip(reason="Requires actual scanner execution - slow test")
    def test_background_scan_execution(self, app, db, sample_config_file):
    def test_background_scan_execution(self, app, db, sample_db_config):
        """
        Integration test for actual background scan execution.

@@ -200,7 +200,7 @@ class TestBackgroundJobs:
        # Trigger scan
        scan_service = ScanService(db)
        scan_id = scan_service.trigger_scan(
            config_file=sample_config_file,
            config_id=sample_db_config.id,
            triggered_by='test',
            scheduler=app.scheduler
        )
@@ -37,14 +37,14 @@ class TestScanAPIEndpoints:
        assert len(data['scans']) == 1
        assert data['scans'][0]['id'] == sample_scan.id

    def test_list_scans_pagination(self, client, db):
    def test_list_scans_pagination(self, client, db, sample_db_config):
        """Test scan list pagination."""
        # Create 25 scans
        for i in range(25):
            scan = Scan(
                timestamp=datetime.utcnow(),
                status='completed',
                config_file=f'/app/configs/test{i}.yaml',
                config_id=sample_db_config.id,
                title=f'Test Scan {i}',
                triggered_by='test'
            )
@@ -81,7 +81,7 @@ class TestScanAPIEndpoints:
            scan = Scan(
                timestamp=datetime.utcnow(),
                status=status,
                config_file='/app/configs/test.yaml',
                config_id=1,
                title=f'{status.capitalize()} Scan',
                triggered_by='test'
            )
@@ -123,10 +123,10 @@ class TestScanAPIEndpoints:
        assert 'error' in data
        assert data['error'] == 'Not found'

    def test_trigger_scan_success(self, client, db, sample_config_file):
    def test_trigger_scan_success(self, client, db, sample_db_config):
        """Test triggering a new scan."""
        response = client.post('/api/scans',
            json={'config_file': str(sample_config_file)},
            json={'config_id': sample_db_config.id},
            content_type='application/json'
        )
        assert response.status_code == 201
@@ -142,8 +142,8 @@ class TestScanAPIEndpoints:
        assert scan.status == 'running'
        assert scan.triggered_by == 'api'

    def test_trigger_scan_missing_config_file(self, client, db):
        """Test triggering scan without config_file."""
    def test_trigger_scan_missing_config_id(self, client, db):
        """Test triggering scan without config_id."""
        response = client.post('/api/scans',
            json={},
            content_type='application/json'
@@ -152,12 +152,12 @@ class TestScanAPIEndpoints:

        data = json.loads(response.data)
        assert 'error' in data
        assert 'config_file is required' in data['message']
        assert 'config_id is required' in data['message']

    def test_trigger_scan_invalid_config_file(self, client, db):
        """Test triggering scan with non-existent config file."""
    def test_trigger_scan_invalid_config_id(self, client, db):
        """Test triggering scan with non-existent config."""
        response = client.post('/api/scans',
            json={'config_file': '/nonexistent/config.yaml'},
            json={'config_id': 99999},
            content_type='application/json'
        )
        assert response.status_code == 400
@@ -222,7 +222,7 @@ class TestScanAPIEndpoints:
        assert 'error' in data
        assert 'message' in data

    def test_scan_workflow_integration(self, client, db, sample_config_file):
    def test_scan_workflow_integration(self, client, db, sample_db_config):
        """
        Test complete scan workflow: trigger → status → retrieve → delete.

@@ -231,7 +231,7 @@ class TestScanAPIEndpoints:
        """
        # Step 1: Trigger scan
        response = client.post('/api/scans',
            json={'config_file': str(sample_config_file)},
            json={'config_id': sample_db_config.id},
            content_type='application/json'
        )
        assert response.status_code == 201
@@ -17,10 +17,10 @@ class TestScanComparison:
    """Tests for scan comparison methods."""

    @pytest.fixture
    def scan1_data(self, test_db, sample_config_file):
    def scan1_data(self, test_db, sample_db_config):
        """Create first scan with test data."""
        service = ScanService(test_db)
        scan_id = service.trigger_scan(sample_config_file, triggered_by='manual')
        scan_id = service.trigger_scan(sample_db_config, triggered_by='manual')

        # Get scan and add some test data
        scan = test_db.query(Scan).filter(Scan.id == scan_id).first()
@@ -77,10 +77,10 @@ class TestScanComparison:
        return scan_id

    @pytest.fixture
    def scan2_data(self, test_db, sample_config_file):
    def scan2_data(self, test_db, sample_db_config):
        """Create second scan with modified test data."""
        service = ScanService(test_db)
        scan_id = service.trigger_scan(sample_config_file, triggered_by='manual')
        scan_id = service.trigger_scan(sample_db_config, triggered_by='manual')

        # Get scan and add some test data
        scan = test_db.query(Scan).filter(Scan.id == scan_id).first()
@@ -13,49 +13,42 @@ from web.services.scan_service import ScanService
class TestScanServiceTrigger:
    """Tests for triggering scans."""

    def test_trigger_scan_valid_config(self, test_db, sample_config_file):
        """Test triggering a scan with valid config file."""
        service = ScanService(test_db)
    def test_trigger_scan_valid_config(self, db, sample_db_config):
        """Test triggering a scan with valid config."""
        service = ScanService(db)

        scan_id = service.trigger_scan(sample_config_file, triggered_by='manual')
        scan_id = service.trigger_scan(config_id=sample_db_config.id, triggered_by='manual')

        # Verify scan created
        assert scan_id is not None
        assert isinstance(scan_id, int)

        # Verify scan in database
        scan = test_db.query(Scan).filter(Scan.id == scan_id).first()
        scan = db.query(Scan).filter(Scan.id == scan_id).first()
        assert scan is not None
        assert scan.status == 'running'
        assert scan.title == 'Test Scan'
        assert scan.triggered_by == 'manual'
        assert scan.config_file == sample_config_file
        assert scan.config_id == sample_db_config.id

    def test_trigger_scan_invalid_config(self, test_db, sample_invalid_config_file):
        """Test triggering a scan with invalid config file."""
        service = ScanService(test_db)
    def test_trigger_scan_invalid_config(self, db):
        """Test triggering a scan with invalid config ID."""
        service = ScanService(db)

        with pytest.raises(ValueError, match="Invalid config file"):
            service.trigger_scan(sample_invalid_config_file)
        with pytest.raises(ValueError, match="not found"):
            service.trigger_scan(config_id=99999)

    def test_trigger_scan_nonexistent_file(self, test_db):
        """Test triggering a scan with nonexistent config file."""
        service = ScanService(test_db)

        with pytest.raises(ValueError, match="does not exist"):
            service.trigger_scan('/nonexistent/config.yaml')

    def test_trigger_scan_with_schedule(self, test_db, sample_config_file):
    def test_trigger_scan_with_schedule(self, db, sample_db_config):
        """Test triggering a scan via schedule."""
        service = ScanService(test_db)
        service = ScanService(db)

        scan_id = service.trigger_scan(
            sample_config_file,
            config_id=sample_db_config.id,
            triggered_by='scheduled',
            schedule_id=42
        )

        scan = test_db.query(Scan).filter(Scan.id == scan_id).first()
        scan = db.query(Scan).filter(Scan.id == scan_id).first()
        assert scan.triggered_by == 'scheduled'
        assert scan.schedule_id == 42

@@ -63,19 +56,19 @@ class TestScanServiceTrigger:
class TestScanServiceGet:
    """Tests for retrieving scans."""

    def test_get_scan_not_found(self, test_db):
    def test_get_scan_not_found(self, db):
        """Test getting a nonexistent scan."""
        service = ScanService(test_db)
        service = ScanService(db)

        result = service.get_scan(999)
        assert result is None

    def test_get_scan_found(self, test_db, sample_config_file):
    def test_get_scan_found(self, db, sample_db_config):
        """Test getting an existing scan."""
        service = ScanService(test_db)
        service = ScanService(db)

        # Create a scan
        scan_id = service.trigger_scan(sample_config_file)
        scan_id = service.trigger_scan(config_id=sample_db_config.id)

        # Retrieve it
        result = service.get_scan(scan_id)
@@ -90,9 +83,9 @@ class TestScanServiceGet:
class TestScanServiceList:
    """Tests for listing scans."""

    def test_list_scans_empty(self, test_db):
    def test_list_scans_empty(self, db):
        """Test listing scans when database is empty."""
        service = ScanService(test_db)
        service = ScanService(db)

        result = service.list_scans(page=1, per_page=20)

@@ -100,13 +93,13 @@ class TestScanServiceList:
        assert len(result.items) == 0
        assert result.pages == 0

    def test_list_scans_with_data(self, test_db, sample_config_file):
    def test_list_scans_with_data(self, db, sample_db_config):
        """Test listing scans with multiple scans."""
        service = ScanService(test_db)
        service = ScanService(db)

        # Create 3 scans
        for i in range(3):
            service.trigger_scan(sample_config_file, triggered_by='api')
            service.trigger_scan(config_id=sample_db_config.id, triggered_by='api')

        # List all scans
        result = service.list_scans(page=1, per_page=20)
@@ -115,13 +108,13 @@ class TestScanServiceList:
        assert len(result.items) == 3
        assert result.pages == 1

    def test_list_scans_pagination(self, test_db, sample_config_file):
    def test_list_scans_pagination(self, db, sample_db_config):
        """Test pagination."""
        service = ScanService(test_db)
        service = ScanService(db)

        # Create 5 scans
        for i in range(5):
            service.trigger_scan(sample_config_file)
            service.trigger_scan(config_id=sample_db_config.id)

        # Get page 1 (2 items per page)
        result = service.list_scans(page=1, per_page=2)
@@ -141,18 +134,18 @@ class TestScanServiceList:
        assert len(result.items) == 1
        assert result.has_next is False

    def test_list_scans_filter_by_status(self, test_db, sample_config_file):
    def test_list_scans_filter_by_status(self, db, sample_db_config):
        """Test filtering scans by status."""
        service = ScanService(test_db)
        service = ScanService(db)

        # Create scans with different statuses
        scan_id_1 = service.trigger_scan(sample_config_file)
        scan_id_2 = service.trigger_scan(sample_config_file)
        scan_id_1 = service.trigger_scan(config_id=sample_db_config.id)
        scan_id_2 = service.trigger_scan(config_id=sample_db_config.id)

        # Mark one as completed
        scan = test_db.query(Scan).filter(Scan.id == scan_id_1).first()
        scan = db.query(Scan).filter(Scan.id == scan_id_1).first()
        scan.status = 'completed'
        test_db.commit()
        db.commit()

        # Filter by running
        result = service.list_scans(status_filter='running')
@@ -162,9 +155,9 @@ class TestScanServiceList:
        result = service.list_scans(status_filter='completed')
        assert result.total == 1

    def test_list_scans_invalid_status_filter(self, test_db):
    def test_list_scans_invalid_status_filter(self, db):
        """Test filtering with invalid status."""
        service = ScanService(test_db)
        service = ScanService(db)

        with pytest.raises(ValueError, match="Invalid status"):
            service.list_scans(status_filter='invalid_status')
@@ -173,46 +166,46 @@ class TestScanServiceList:
class TestScanServiceDelete:
    """Tests for deleting scans."""

    def test_delete_scan_not_found(self, test_db):
    def test_delete_scan_not_found(self, db):
        """Test deleting a nonexistent scan."""
        service = ScanService(test_db)
        service = ScanService(db)

        with pytest.raises(ValueError, match="not found"):
            service.delete_scan(999)

    def test_delete_scan_success(self, test_db, sample_config_file):
    def test_delete_scan_success(self, db, sample_db_config):
        """Test successful scan deletion."""
        service = ScanService(test_db)
        service = ScanService(db)

        # Create a scan
        scan_id = service.trigger_scan(sample_config_file)
        scan_id = service.trigger_scan(config_id=sample_db_config.id)

        # Verify it exists
        assert test_db.query(Scan).filter(Scan.id == scan_id).first() is not None
        assert db.query(Scan).filter(Scan.id == scan_id).first() is not None

        # Delete it
        result = service.delete_scan(scan_id)
        assert result is True

        # Verify it's gone
        assert test_db.query(Scan).filter(Scan.id == scan_id).first() is None
|
||||
assert db.query(Scan).filter(Scan.id == scan_id).first() is None
|
||||
|
||||
|
||||
class TestScanServiceStatus:
|
||||
"""Tests for scan status retrieval."""
|
||||
|
||||
def test_get_scan_status_not_found(self, test_db):
|
||||
def test_get_scan_status_not_found(self, db):
|
||||
"""Test getting status of nonexistent scan."""
|
||||
service = ScanService(test_db)
|
||||
service = ScanService(db)
|
||||
|
||||
result = service.get_scan_status(999)
|
||||
assert result is None
|
||||
|
||||
def test_get_scan_status_running(self, test_db, sample_config_file):
|
||||
def test_get_scan_status_running(self, db, sample_db_config):
|
||||
"""Test getting status of running scan."""
|
||||
service = ScanService(test_db)
|
||||
service = ScanService(db)
|
||||
|
||||
scan_id = service.trigger_scan(sample_config_file)
|
||||
scan_id = service.trigger_scan(config_id=sample_db_config.id)
|
||||
status = service.get_scan_status(scan_id)
|
||||
|
||||
assert status is not None
|
||||
@@ -221,16 +214,16 @@ class TestScanServiceStatus:
|
||||
assert status['progress'] == 'In progress'
|
||||
assert status['title'] == 'Test Scan'
|
||||
|
||||
def test_get_scan_status_completed(self, test_db, sample_config_file):
|
||||
def test_get_scan_status_completed(self, db, sample_db_config):
|
||||
"""Test getting status of completed scan."""
|
||||
service = ScanService(test_db)
|
||||
service = ScanService(db)
|
||||
|
||||
# Create and mark as completed
|
||||
scan_id = service.trigger_scan(sample_config_file)
|
||||
scan = test_db.query(Scan).filter(Scan.id == scan_id).first()
|
||||
scan_id = service.trigger_scan(config_id=sample_db_config.id)
|
||||
scan = db.query(Scan).filter(Scan.id == scan_id).first()
|
||||
scan.status = 'completed'
|
||||
scan.duration = 125.5
|
||||
test_db.commit()
|
||||
db.commit()
|
||||
|
||||
status = service.get_scan_status(scan_id)
|
||||
|
||||
@@ -242,35 +235,35 @@ class TestScanServiceStatus:
|
||||
class TestScanServiceDatabaseMapping:
|
||||
"""Tests for mapping scan reports to database models."""
|
||||
|
||||
def test_save_scan_to_db(self, test_db, sample_config_file, sample_scan_report):
|
||||
def test_save_scan_to_db(self, db, sample_db_config, sample_scan_report):
|
||||
"""Test saving a complete scan report to database."""
|
||||
service = ScanService(test_db)
|
||||
service = ScanService(db)
|
||||
|
||||
# Create a scan
|
||||
scan_id = service.trigger_scan(sample_config_file)
|
||||
scan_id = service.trigger_scan(config_id=sample_db_config.id)
|
||||
|
||||
# Save report to database
|
||||
service._save_scan_to_db(sample_scan_report, scan_id, status='completed')
|
||||
|
||||
# Verify scan updated
|
||||
scan = test_db.query(Scan).filter(Scan.id == scan_id).first()
|
||||
scan = db.query(Scan).filter(Scan.id == scan_id).first()
|
||||
assert scan.status == 'completed'
|
||||
assert scan.duration == 125.5
|
||||
|
||||
# Verify sites created
|
||||
sites = test_db.query(ScanSite).filter(ScanSite.scan_id == scan_id).all()
|
||||
sites = db.query(ScanSite).filter(ScanSite.scan_id == scan_id).all()
|
||||
assert len(sites) == 1
|
||||
assert sites[0].site_name == 'Test Site'
|
||||
|
||||
# Verify IPs created
|
||||
ips = test_db.query(ScanIP).filter(ScanIP.scan_id == scan_id).all()
|
||||
ips = db.query(ScanIP).filter(ScanIP.scan_id == scan_id).all()
|
||||
assert len(ips) == 1
|
||||
assert ips[0].ip_address == '192.168.1.10'
|
||||
assert ips[0].ping_expected is True
|
||||
assert ips[0].ping_actual is True
|
||||
|
||||
# Verify ports created (TCP: 22, 80, 443, 8080 | UDP: 53)
|
||||
ports = test_db.query(ScanPort).filter(ScanPort.scan_id == scan_id).all()
|
||||
ports = db.query(ScanPort).filter(ScanPort.scan_id == scan_id).all()
|
||||
assert len(ports) == 5 # 4 TCP + 1 UDP
|
||||
|
||||
# Verify TCP ports
|
||||
@@ -285,7 +278,7 @@ class TestScanServiceDatabaseMapping:
|
||||
assert udp_ports[0].port == 53
|
||||
|
||||
# Verify services created
|
||||
services = test_db.query(ScanServiceModel).filter(
|
||||
services = db.query(ScanServiceModel).filter(
|
||||
ScanServiceModel.scan_id == scan_id
|
||||
).all()
|
||||
assert len(services) == 4 # SSH, HTTP (80), HTTPS, HTTP (8080)
|
||||
@@ -300,15 +293,15 @@ class TestScanServiceDatabaseMapping:
|
||||
assert https_service.http_protocol == 'https'
|
||||
assert https_service.screenshot_path == 'screenshots/192_168_1_10_443.png'
|
||||
|
||||
def test_map_port_expected_vs_actual(self, test_db, sample_config_file, sample_scan_report):
|
||||
def test_map_port_expected_vs_actual(self, db, sample_db_config, sample_scan_report):
|
||||
"""Test that expected vs actual ports are correctly flagged."""
|
||||
service = ScanService(test_db)
|
||||
service = ScanService(db)
|
||||
|
||||
scan_id = service.trigger_scan(sample_config_file)
|
||||
scan_id = service.trigger_scan(config_id=sample_db_config.id)
|
||||
service._save_scan_to_db(sample_scan_report, scan_id)
|
||||
|
||||
# Check TCP ports
|
||||
tcp_ports = test_db.query(ScanPort).filter(
|
||||
tcp_ports = db.query(ScanPort).filter(
|
||||
ScanPort.scan_id == scan_id,
|
||||
ScanPort.protocol == 'tcp'
|
||||
).all()
|
||||
@@ -322,15 +315,15 @@ class TestScanServiceDatabaseMapping:
|
||||
# Port 8080 was not expected
|
||||
assert port.expected is False, f"Port {port.port} should not be expected"
|
||||
|
||||
def test_map_certificate_and_tls(self, test_db, sample_config_file, sample_scan_report):
|
||||
def test_map_certificate_and_tls(self, db, sample_db_config, sample_scan_report):
|
||||
"""Test that certificate and TLS data are correctly mapped."""
|
||||
service = ScanService(test_db)
|
||||
service = ScanService(db)
|
||||
|
||||
scan_id = service.trigger_scan(sample_config_file)
|
||||
scan_id = service.trigger_scan(config_id=sample_db_config.id)
|
||||
service._save_scan_to_db(sample_scan_report, scan_id)
|
||||
|
||||
# Find HTTPS service
|
||||
https_service = test_db.query(ScanServiceModel).filter(
|
||||
https_service = db.query(ScanServiceModel).filter(
|
||||
ScanServiceModel.scan_id == scan_id,
|
||||
ScanServiceModel.service_name == 'https'
|
||||
).first()
|
||||
@@ -363,11 +356,11 @@ class TestScanServiceDatabaseMapping:
|
||||
assert tls_13 is not None
|
||||
assert tls_13.supported is True
|
||||
|
||||
def test_get_scan_with_full_details(self, test_db, sample_config_file, sample_scan_report):
|
||||
def test_get_scan_with_full_details(self, db, sample_db_config, sample_scan_report):
|
||||
"""Test retrieving scan with all nested relationships."""
|
||||
service = ScanService(test_db)
|
||||
service = ScanService(db)
|
||||
|
||||
scan_id = service.trigger_scan(sample_config_file)
|
||||
scan_id = service.trigger_scan(config_id=sample_db_config.id)
|
||||
service._save_scan_to_db(sample_scan_report, scan_id)
|
||||
|
||||
# Get full scan details
|
||||
|
||||
@@ -13,20 +13,20 @@ from web.models import Schedule, Scan

@pytest.fixture
-def sample_schedule(db, sample_config_file):
+def sample_schedule(db, sample_db_config):
    """
    Create a sample schedule in the database for testing.

    Args:
        db: Database session fixture
-        sample_config_file: Path to test config file
+        sample_db_config: Config record stored in the test database

    Returns:
        Schedule model instance
    """
    schedule = Schedule(
        name='Daily Test Scan',
-        config_file=sample_config_file,
+        config_id=sample_db_config.id,
        cron_expression='0 2 * * *',
        enabled=True,
        last_run=None,
@@ -68,13 +68,13 @@ class TestScheduleAPIEndpoints:
        assert data['schedules'][0]['name'] == sample_schedule.name
        assert data['schedules'][0]['cron_expression'] == sample_schedule.cron_expression

-    def test_list_schedules_pagination(self, client, db, sample_config_file):
+    def test_list_schedules_pagination(self, client, db, sample_db_config):
        """Test schedule list pagination."""
        # Create 25 schedules
        for i in range(25):
            schedule = Schedule(
                name=f'Schedule {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 2 * * *',
                enabled=True,
                created_at=datetime.utcnow()
@@ -101,13 +101,13 @@ class TestScheduleAPIEndpoints:
        assert len(data['schedules']) == 10
        assert data['page'] == 2

-    def test_list_schedules_filter_enabled(self, client, db, sample_config_file):
+    def test_list_schedules_filter_enabled(self, client, db, sample_db_config):
        """Test filtering schedules by enabled status."""
        # Create enabled and disabled schedules
        for i in range(3):
            schedule = Schedule(
                name=f'Enabled Schedule {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 2 * * *',
                enabled=True,
                created_at=datetime.utcnow()
@@ -117,7 +117,7 @@ class TestScheduleAPIEndpoints:
        for i in range(2):
            schedule = Schedule(
                name=f'Disabled Schedule {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 3 * * *',
                enabled=False,
                created_at=datetime.utcnow()
@@ -151,7 +151,7 @@ class TestScheduleAPIEndpoints:
        data = json.loads(response.data)
        assert data['id'] == sample_schedule.id
        assert data['name'] == sample_schedule.name
-        assert data['config_file'] == sample_schedule.config_file
+        assert data['config_id'] == sample_schedule.config_id
        assert data['cron_expression'] == sample_schedule.cron_expression
        assert data['enabled'] == sample_schedule.enabled
        assert 'history' in data
@@ -165,11 +165,11 @@ class TestScheduleAPIEndpoints:
        assert 'error' in data
        assert 'not found' in data['error'].lower()

-    def test_create_schedule(self, client, db, sample_config_file):
+    def test_create_schedule(self, client, db, sample_db_config):
        """Test creating a new schedule."""
        schedule_data = {
            'name': 'New Test Schedule',
-            'config_file': sample_config_file,
+            'config_id': sample_db_config.id,
            'cron_expression': '0 3 * * *',
            'enabled': True
        }
@@ -197,7 +197,7 @@ class TestScheduleAPIEndpoints:
        # Missing cron_expression
        schedule_data = {
            'name': 'Incomplete Schedule',
-            'config_file': '/app/configs/test.yaml'
+            'config_id': 1
        }

        response = client.post(
@@ -211,11 +211,11 @@ class TestScheduleAPIEndpoints:
        assert 'error' in data
        assert 'missing' in data['error'].lower()

-    def test_create_schedule_invalid_cron(self, client, db, sample_config_file):
+    def test_create_schedule_invalid_cron(self, client, db, sample_db_config):
        """Test creating schedule with invalid cron expression."""
        schedule_data = {
            'name': 'Invalid Cron Schedule',
-            'config_file': sample_config_file,
+            'config_id': sample_db_config.id,
            'cron_expression': 'invalid cron'
        }

@@ -231,10 +231,10 @@ class TestScheduleAPIEndpoints:
        assert 'invalid' in data['error'].lower() or 'cron' in data['error'].lower()

    def test_create_schedule_invalid_config(self, client, db):
-        """Test creating schedule with non-existent config file."""
+        """Test creating schedule with non-existent config."""
        schedule_data = {
            'name': 'Invalid Config Schedule',
-            'config_file': '/nonexistent/config.yaml',
+            'config_id': 99999,
            'cron_expression': '0 2 * * *'
        }

@@ -360,13 +360,13 @@ class TestScheduleAPIEndpoints:
        data = json.loads(response.data)
        assert 'error' in data

-    def test_delete_schedule_preserves_scans(self, client, db, sample_schedule, sample_config_file):
+    def test_delete_schedule_preserves_scans(self, client, db, sample_schedule, sample_db_config):
        """Test that deleting schedule preserves associated scans."""
        # Create a scan associated with the schedule
        scan = Scan(
            timestamp=datetime.utcnow(),
            status='completed',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            title='Test Scan',
            triggered_by='scheduled',
            schedule_id=sample_schedule.id
@@ -399,7 +399,7 @@ class TestScheduleAPIEndpoints:
        assert scan is not None
        assert scan.triggered_by == 'manual'
        assert scan.schedule_id == sample_schedule.id
-        assert scan.config_file == sample_schedule.config_file
+        assert scan.config_id == sample_schedule.config_id

    def test_trigger_schedule_not_found(self, client, db):
        """Test triggering non-existent schedule."""
@@ -409,14 +409,14 @@ class TestScheduleAPIEndpoints:
        data = json.loads(response.data)
        assert 'error' in data

-    def test_get_schedule_with_history(self, client, db, sample_schedule, sample_config_file):
+    def test_get_schedule_with_history(self, client, db, sample_schedule, sample_db_config):
        """Test getting schedule includes execution history."""
        # Create some scans for this schedule
        for i in range(5):
            scan = Scan(
                timestamp=datetime.utcnow(),
                status='completed',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                title=f'Scheduled Scan {i}',
                triggered_by='scheduled',
                schedule_id=sample_schedule.id
@@ -431,12 +431,12 @@ class TestScheduleAPIEndpoints:
        assert 'history' in data
        assert len(data['history']) == 5

-    def test_schedule_workflow_integration(self, client, db, sample_config_file):
+    def test_schedule_workflow_integration(self, client, db, sample_db_config):
        """Test complete schedule workflow: create → update → trigger → delete."""
        # 1. Create schedule
        schedule_data = {
            'name': 'Integration Test Schedule',
-            'config_file': sample_config_file,
+            'config_id': sample_db_config.id,
            'cron_expression': '0 2 * * *',
            'enabled': True
        }
@@ -482,14 +482,14 @@ class TestScheduleAPIEndpoints:
        scan = db.query(Scan).filter(Scan.id == scan_id).first()
        assert scan is not None

-    def test_list_schedules_ordering(self, client, db, sample_config_file):
+    def test_list_schedules_ordering(self, client, db, sample_db_config):
        """Test that schedules are ordered by next_run time."""
        # Create schedules with different next_run times
        schedules = []
        for i in range(3):
            schedule = Schedule(
                name=f'Schedule {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 2 * * *',
                enabled=True,
                next_run=datetime(2025, 11, 15 + i, 2, 0, 0),
@@ -501,7 +501,7 @@ class TestScheduleAPIEndpoints:
        # Create a disabled schedule (next_run is None)
        disabled_schedule = Schedule(
            name='Disabled Schedule',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 3 * * *',
            enabled=False,
            next_run=None,
@@ -523,11 +523,11 @@ class TestScheduleAPIEndpoints:
        assert returned_schedules[2]['id'] == schedules[2].id
        assert returned_schedules[3]['id'] == disabled_schedule.id

-    def test_create_schedule_with_disabled(self, client, db, sample_config_file):
+    def test_create_schedule_with_disabled(self, client, db, sample_db_config):
        """Test creating a disabled schedule."""
        schedule_data = {
            'name': 'Disabled Schedule',
-            'config_file': sample_config_file,
+            'config_id': sample_db_config.id,
            'cron_expression': '0 2 * * *',
            'enabled': False
        }
@@ -587,7 +587,7 @@ class TestScheduleAPIAuthentication:
class TestScheduleAPICronValidation:
    """Test suite for cron expression validation."""

-    def test_valid_cron_expressions(self, client, db, sample_config_file):
+    def test_valid_cron_expressions(self, client, db, sample_db_config):
        """Test various valid cron expressions."""
        valid_expressions = [
            '0 2 * * *',  # Daily at 2am
@@ -600,7 +600,7 @@ class TestScheduleAPICronValidation:
        for cron_expr in valid_expressions:
            schedule_data = {
                'name': f'Schedule for {cron_expr}',
-                'config_file': sample_config_file,
+                'config_id': sample_db_config.id,
                'cron_expression': cron_expr
            }

@@ -612,7 +612,7 @@ class TestScheduleAPICronValidation:
            assert response.status_code == 201, \
                f"Valid cron expression '{cron_expr}' should be accepted"

-    def test_invalid_cron_expressions(self, client, db, sample_config_file):
+    def test_invalid_cron_expressions(self, client, db, sample_db_config):
        """Test various invalid cron expressions."""
        invalid_expressions = [
            'invalid',
@@ -626,7 +626,7 @@ class TestScheduleAPICronValidation:
        for cron_expr in invalid_expressions:
            schedule_data = {
                'name': f'Schedule for {cron_expr}',
-                'config_file': sample_config_file,
+                'config_id': sample_db_config.id,
                'cron_expression': cron_expr
            }

@@ -15,13 +15,13 @@ from web.services.schedule_service import ScheduleService
class TestScheduleServiceCreate:
    """Tests for creating schedules."""

-    def test_create_schedule_valid(self, test_db, sample_config_file):
+    def test_create_schedule_valid(self, db, sample_db_config):
        """Test creating a schedule with valid parameters."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Daily Scan',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -31,57 +31,57 @@ class TestScheduleServiceCreate:
        assert isinstance(schedule_id, int)

        # Verify schedule in database
-        schedule = test_db.query(Schedule).filter(Schedule.id == schedule_id).first()
+        schedule = db.query(Schedule).filter(Schedule.id == schedule_id).first()
        assert schedule is not None
        assert schedule.name == 'Daily Scan'
-        assert schedule.config_file == sample_config_file
+        assert schedule.config_id == sample_db_config.id
        assert schedule.cron_expression == '0 2 * * *'
        assert schedule.enabled is True
        assert schedule.next_run is not None
        assert schedule.last_run is None

-    def test_create_schedule_disabled(self, test_db, sample_config_file):
+    def test_create_schedule_disabled(self, db, sample_db_config):
        """Test creating a disabled schedule."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Disabled Scan',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 3 * * *',
            enabled=False
        )

-        schedule = test_db.query(Schedule).filter(Schedule.id == schedule_id).first()
+        schedule = db.query(Schedule).filter(Schedule.id == schedule_id).first()
        assert schedule.enabled is False
        assert schedule.next_run is None

-    def test_create_schedule_invalid_cron(self, test_db, sample_config_file):
+    def test_create_schedule_invalid_cron(self, db, sample_db_config):
        """Test creating a schedule with invalid cron expression."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        with pytest.raises(ValueError, match="Invalid cron expression"):
            service.create_schedule(
                name='Invalid Schedule',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='invalid cron',
                enabled=True
            )

-    def test_create_schedule_nonexistent_config(self, test_db):
-        """Test creating a schedule with nonexistent config file."""
-        service = ScheduleService(test_db)
+    def test_create_schedule_nonexistent_config(self, db):
+        """Test creating a schedule with nonexistent config."""
+        service = ScheduleService(db)

-        with pytest.raises(ValueError, match="Config file not found"):
+        with pytest.raises(ValueError, match="not found"):
            service.create_schedule(
                name='Bad Config',
-                config_file='/nonexistent/config.yaml',
+                config_id=99999,
                cron_expression='0 2 * * *',
                enabled=True
            )

-    def test_create_schedule_various_cron_expressions(self, test_db, sample_config_file):
+    def test_create_schedule_various_cron_expressions(self, db, sample_db_config):
        """Test creating schedules with various valid cron expressions."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        cron_expressions = [
            '0 0 * * *',  # Daily at midnight
@@ -94,7 +94,7 @@ class TestScheduleServiceCreate:
        for i, cron in enumerate(cron_expressions):
            schedule_id = service.create_schedule(
                name=f'Schedule {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression=cron,
                enabled=True
            )
@@ -104,21 +104,21 @@ class TestScheduleServiceCreate:
class TestScheduleServiceGet:
    """Tests for retrieving schedules."""

-    def test_get_schedule_not_found(self, test_db):
+    def test_get_schedule_not_found(self, db):
        """Test getting a nonexistent schedule."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        with pytest.raises(ValueError, match="Schedule .* not found"):
            service.get_schedule(999)

-    def test_get_schedule_found(self, test_db, sample_config_file):
+    def test_get_schedule_found(self, db, sample_db_config):
        """Test getting an existing schedule."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        # Create a schedule
        schedule_id = service.create_schedule(
            name='Test Schedule',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -134,14 +134,14 @@ class TestScheduleServiceGet:
        assert 'history' in result
        assert isinstance(result['history'], list)

-    def test_get_schedule_with_history(self, test_db, sample_config_file):
+    def test_get_schedule_with_history(self, db, sample_db_config):
        """Test getting schedule includes execution history."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        # Create schedule
        schedule_id = service.create_schedule(
            name='Test Schedule',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -151,13 +151,13 @@ class TestScheduleServiceGet:
            scan = Scan(
                timestamp=datetime.utcnow() - timedelta(days=i),
                status='completed',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                title=f'Scan {i}',
                triggered_by='scheduled',
                schedule_id=schedule_id
            )
-            test_db.add(scan)
-        test_db.commit()
+            db.add(scan)
+        db.commit()

        # Get schedule
        result = service.get_schedule(schedule_id)
@@ -169,9 +169,9 @@ class TestScheduleServiceList:
class TestScheduleServiceList:
    """Tests for listing schedules."""

-    def test_list_schedules_empty(self, test_db):
+    def test_list_schedules_empty(self, db):
        """Test listing schedules when database is empty."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        result = service.list_schedules(page=1, per_page=20)

@@ -180,15 +180,15 @@ class TestScheduleServiceList:
        assert result['page'] == 1
        assert result['per_page'] == 20

-    def test_list_schedules_populated(self, test_db, sample_config_file):
+    def test_list_schedules_populated(self, db, sample_db_config):
        """Test listing schedules with data."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        # Create multiple schedules
        for i in range(5):
            service.create_schedule(
                name=f'Schedule {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 2 * * *',
                enabled=True
            )
@@ -199,15 +199,15 @@ class TestScheduleServiceList:
        assert len(result['schedules']) == 5
        assert all('name' in s for s in result['schedules'])

-    def test_list_schedules_pagination(self, test_db, sample_config_file):
+    def test_list_schedules_pagination(self, db, sample_db_config):
        """Test schedule pagination."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        # Create 25 schedules
        for i in range(25):
            service.create_schedule(
                name=f'Schedule {i:02d}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 2 * * *',
                enabled=True
            )
@@ -226,22 +226,22 @@ class TestScheduleServiceList:
        result_page3 = service.list_schedules(page=3, per_page=10)
        assert len(result_page3['schedules']) == 5

-    def test_list_schedules_filter_enabled(self, test_db, sample_config_file):
+    def test_list_schedules_filter_enabled(self, db, sample_db_config):
        """Test filtering schedules by enabled status."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        # Create enabled and disabled schedules
        for i in range(3):
            service.create_schedule(
                name=f'Enabled {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 2 * * *',
                enabled=True
            )
        for i in range(2):
            service.create_schedule(
                name=f'Disabled {i}',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                cron_expression='0 2 * * *',
                enabled=False
            )
@@ -262,13 +262,13 @@ class TestScheduleServiceList:
class TestScheduleServiceUpdate:
    """Tests for updating schedules."""

-    def test_update_schedule_name(self, test_db, sample_config_file):
+    def test_update_schedule_name(self, db, sample_db_config):
        """Test updating schedule name."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Old Name',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -278,13 +278,13 @@ class TestScheduleServiceUpdate:
        assert result['name'] == 'New Name'
        assert result['cron_expression'] == '0 2 * * *'

-    def test_update_schedule_cron(self, test_db, sample_config_file):
+    def test_update_schedule_cron(self, db, sample_db_config):
        """Test updating cron expression recalculates next_run."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -302,13 +302,13 @@ class TestScheduleServiceUpdate:
        assert result['cron_expression'] == '0 3 * * *'
        assert result['next_run'] != original_next_run

-    def test_update_schedule_invalid_cron(self, test_db, sample_config_file):
+    def test_update_schedule_invalid_cron(self, db, sample_db_config):
        """Test updating with invalid cron expression fails."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -316,67 +316,67 @@ class TestScheduleServiceUpdate:
        with pytest.raises(ValueError, match="Invalid cron expression"):
            service.update_schedule(schedule_id, cron_expression='invalid')

-    def test_update_schedule_not_found(self, test_db):
+    def test_update_schedule_not_found(self, db):
        """Test updating nonexistent schedule fails."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        with pytest.raises(ValueError, match="Schedule .* not found"):
            service.update_schedule(999, name='New Name')

-    def test_update_schedule_invalid_config_file(self, test_db, sample_config_file):
-        """Test updating with nonexistent config file fails."""
-        service = ScheduleService(test_db)
+    def test_update_schedule_invalid_config_id(self, db, sample_db_config):
+        """Test updating with nonexistent config ID fails."""
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )

-        with pytest.raises(ValueError, match="Config file not found"):
-            service.update_schedule(schedule_id, config_file='/nonexistent.yaml')
+        with pytest.raises(ValueError, match="not found"):
+            service.update_schedule(schedule_id, config_id=99999)


class TestScheduleServiceDelete:
    """Tests for deleting schedules."""

-    def test_delete_schedule(self, test_db, sample_config_file):
+    def test_delete_schedule(self, db, sample_db_config):
        """Test deleting a schedule."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='To Delete',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )

        # Verify exists
-        assert test_db.query(Schedule).filter(Schedule.id == schedule_id).first() is not None
+        assert db.query(Schedule).filter(Schedule.id == schedule_id).first() is not None

        # Delete
        result = service.delete_schedule(schedule_id)
        assert result is True

        # Verify deleted
-        assert test_db.query(Schedule).filter(Schedule.id == schedule_id).first() is None
+        assert db.query(Schedule).filter(Schedule.id == schedule_id).first() is None

-    def test_delete_schedule_not_found(self, test_db):
+    def test_delete_schedule_not_found(self, db):
        """Test deleting nonexistent schedule fails."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        with pytest.raises(ValueError, match="Schedule .* not found"):
            service.delete_schedule(999)

-    def test_delete_schedule_preserves_scans(self, test_db, sample_config_file):
+    def test_delete_schedule_preserves_scans(self, db, sample_db_config):
        """Test that deleting schedule preserves associated scans."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        # Create schedule
        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -385,20 +385,20 @@ class TestScheduleServiceDelete:
        scan = Scan(
            timestamp=datetime.utcnow(),
            status='completed',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            title='Test Scan',
            triggered_by='scheduled',
            schedule_id=schedule_id
        )
-        test_db.add(scan)
-        test_db.commit()
+        db.add(scan)
+        db.commit()
        scan_id = scan.id

        # Delete schedule
        service.delete_schedule(schedule_id)

        # Verify scan still exists (schedule_id becomes null)
-        remaining_scan = test_db.query(Scan).filter(Scan.id == scan_id).first()
+        remaining_scan = db.query(Scan).filter(Scan.id == scan_id).first()
        assert remaining_scan is not None
        assert remaining_scan.schedule_id is None

@@ -406,13 +406,13 @@ class TestScheduleServiceDelete:
class TestScheduleServiceToggle:
    """Tests for toggling schedule enabled status."""

-    def test_toggle_enabled_to_disabled(self, test_db, sample_config_file):
+    def test_toggle_enabled_to_disabled(self, db, sample_db_config):
        """Test disabling an enabled schedule."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -422,13 +422,13 @@ class TestScheduleServiceToggle:
        assert result['enabled'] is False
        assert result['next_run'] is None

-    def test_toggle_disabled_to_enabled(self, test_db, sample_config_file):
+    def test_toggle_disabled_to_enabled(self, db, sample_db_config):
        """Test enabling a disabled schedule."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=False
        )
@@ -442,13 +442,13 @@ class TestScheduleServiceRunTimes:
class TestScheduleServiceRunTimes:
    """Tests for updating run times."""

-    def test_update_run_times(self, test_db, sample_config_file):
+    def test_update_run_times(self, db, sample_db_config):
        """Test updating last_run and next_run."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -463,9 +463,9 @@ class TestScheduleServiceRunTimes:
        assert schedule['last_run'] is not None
        assert schedule['next_run'] is not None

-    def test_update_run_times_not_found(self, test_db):
+    def test_update_run_times_not_found(self, db):
        """Test updating run times for nonexistent schedule."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        with pytest.raises(ValueError, match="Schedule .* not found"):
            service.update_run_times(
@@ -478,9 +478,9 @@ class TestCronValidation:
class TestCronValidation:
    """Tests for cron expression validation."""

-    def test_validate_cron_valid_expressions(self, test_db):
+    def test_validate_cron_valid_expressions(self, db):
        """Test validating various valid cron expressions."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        valid_expressions = [
            '0 0 * * *',  # Daily at midnight
@@ -496,9 +496,9 @@ class TestCronValidation:
        assert is_valid is True, f"Expression '{expr}' should be valid"
        assert error is None

-    def test_validate_cron_invalid_expressions(self, test_db):
+    def test_validate_cron_invalid_expressions(self, db):
        """Test validating invalid cron expressions."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        invalid_expressions = [
            'invalid',
@@ -518,9 +518,9 @@ class TestNextRunCalculation:
class TestNextRunCalculation:
    """Tests for next run time calculation."""

-    def test_calculate_next_run(self, test_db):
+    def test_calculate_next_run(self, db):
        """Test calculating next run time."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        # Daily at 2 AM
        next_run = service.calculate_next_run('0 2 * * *')
@@ -529,9 +529,9 @@ class TestNextRunCalculation:
        assert isinstance(next_run, datetime)
        assert next_run > datetime.utcnow()

-    def test_calculate_next_run_from_time(self, test_db):
+    def test_calculate_next_run_from_time(self, db):
        """Test calculating next run from specific time."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        base_time = datetime(2025, 1, 1, 0, 0, 0)
        next_run = service.calculate_next_run('0 2 * * *', from_time=base_time)
@@ -540,9 +540,9 @@ class TestNextRunCalculation:
        assert next_run.hour == 2
        assert next_run.minute == 0

-    def test_calculate_next_run_invalid_cron(self, test_db):
+    def test_calculate_next_run_invalid_cron(self, db):
        """Test calculating next run with invalid cron raises error."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        with pytest.raises(ValueError, match="Invalid cron expression"):
            service.calculate_next_run('invalid cron')
@@ -551,13 +551,13 @@ class TestScheduleHistory:
class TestScheduleHistory:
    """Tests for schedule execution history."""

-    def test_get_schedule_history_empty(self, test_db, sample_config_file):
+    def test_get_schedule_history_empty(self, db, sample_db_config):
        """Test getting history for schedule with no executions."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -565,13 +565,13 @@ class TestScheduleHistory:
        history = service.get_schedule_history(schedule_id)
        assert len(history) == 0

-    def test_get_schedule_history_with_scans(self, test_db, sample_config_file):
+    def test_get_schedule_history_with_scans(self, db, sample_db_config):
        """Test getting history with multiple scans."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -581,26 +581,26 @@ class TestScheduleHistory:
            scan = Scan(
                timestamp=datetime.utcnow() - timedelta(days=i),
                status='completed',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                title=f'Scan {i}',
                triggered_by='scheduled',
                schedule_id=schedule_id
            )
-            test_db.add(scan)
-        test_db.commit()
+            db.add(scan)
+        db.commit()

        # Get history (default limit 10)
        history = service.get_schedule_history(schedule_id, limit=10)
        assert len(history) == 10
        assert history[0]['title'] == 'Scan 0'  # Most recent first

-    def test_get_schedule_history_custom_limit(self, test_db, sample_config_file):
+    def test_get_schedule_history_custom_limit(self, db, sample_db_config):
        """Test getting history with custom limit."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -610,13 +610,13 @@ class TestScheduleHistory:
            scan = Scan(
                timestamp=datetime.utcnow() - timedelta(days=i),
                status='completed',
-                config_file=sample_config_file,
+                config_id=sample_db_config.id,
                title=f'Scan {i}',
                triggered_by='scheduled',
                schedule_id=schedule_id
            )
-            test_db.add(scan)
-        test_db.commit()
+            db.add(scan)
+        db.commit()

        # Get only 5
        history = service.get_schedule_history(schedule_id, limit=5)
@@ -626,13 +626,13 @@ class TestScheduleSerialization:
class TestScheduleSerialization:
    """Tests for schedule serialization."""

-    def test_schedule_to_dict(self, test_db, sample_config_file):
+    def test_schedule_to_dict(self, db, sample_db_config):
        """Test converting schedule to dictionary."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test Schedule',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )
@@ -642,7 +642,7 @@ class TestScheduleSerialization:
        # Verify all required fields
        assert 'id' in result
        assert 'name' in result
-        assert 'config_file' in result
+        assert 'config_id' in result
        assert 'cron_expression' in result
        assert 'enabled' in result
        assert 'last_run' in result
@@ -652,13 +652,13 @@ class TestScheduleSerialization:
        assert 'updated_at' in result
        assert 'history' in result

-    def test_schedule_relative_time_formatting(self, test_db, sample_config_file):
+    def test_schedule_relative_time_formatting(self, db, sample_db_config):
        """Test relative time formatting in schedule dict."""
-        service = ScheduleService(test_db)
+        service = ScheduleService(db)

        schedule_id = service.create_schedule(
            name='Test',
-            config_file=sample_config_file,
+            config_id=sample_db_config.id,
            cron_expression='0 2 * * *',
            enabled=True
        )

@@ -20,7 +20,7 @@ class TestStatsAPI:
            scan_date = today - timedelta(days=i)
            for j in range(i + 1):  # Create 1, 2, 3, 4, 5 scans per day
                scan = Scan(
-                    config_file='/app/configs/test.yaml',
+                    config_id=1,
                    timestamp=scan_date,
                    status='completed',
                    duration=10.5
@@ -56,7 +56,7 @@ class TestStatsAPI:
        today = datetime.utcnow()
        for i in range(10):
            scan = Scan(
-                config_file='/app/configs/test.yaml',
+                config_id=1,
                timestamp=today - timedelta(days=i),
                status='completed',
                duration=10.5
@@ -105,7 +105,7 @@ class TestStatsAPI:

        # Create scan 5 days ago
        scan1 = Scan(
-            config_file='/app/configs/test.yaml',
+            config_id=1,
            timestamp=today - timedelta(days=5),
            status='completed',
            duration=10.5
@@ -114,7 +114,7 @@ class TestStatsAPI:

        # Create scan 10 days ago
        scan2 = Scan(
-            config_file='/app/configs/test.yaml',
+            config_id=1,
            timestamp=today - timedelta(days=10),
            status='completed',
            duration=10.5
@@ -148,7 +148,7 @@ class TestStatsAPI:
        # 5 completed scans
        for i in range(5):
            scan = Scan(
-                config_file='/app/configs/test.yaml',
+                config_id=1,
                timestamp=today - timedelta(days=i),
                status='completed',
                duration=10.5
@@ -158,7 +158,7 @@ class TestStatsAPI:
        # 2 failed scans
        for i in range(2):
            scan = Scan(
-                config_file='/app/configs/test.yaml',
+                config_id=1,
                timestamp=today - timedelta(days=i),
                status='failed',
                duration=5.0
@@ -167,7 +167,7 @@ class TestStatsAPI:

        # 1 running scan
        scan = Scan(
-            config_file='/app/configs/test.yaml',
+            config_id=1,
            timestamp=today,
            status='running',
            duration=None
@@ -217,7 +217,7 @@ class TestStatsAPI:
        # Create 3 scans today
        for i in range(3):
            scan = Scan(
-                config_file='/app/configs/test.yaml',
+                config_id=1,
                timestamp=today,
                status='completed',
                duration=10.5
@@ -227,7 +227,7 @@ class TestStatsAPI:
        # Create 2 scans yesterday
        for i in range(2):
            scan = Scan(
-                config_file='/app/configs/test.yaml',
+                config_id=1,
                timestamp=yesterday,
                status='completed',
                duration=10.5
@@ -250,7 +250,7 @@ class TestStatsAPI:
        # Create scans over the last 10 days
        for i in range(10):
            scan = Scan(
-                config_file='/app/configs/test.yaml',
+                config_id=1,
                timestamp=today - timedelta(days=i),
                status='completed',
                duration=10.5
@@ -275,7 +275,7 @@ class TestStatsAPI:
        """Test scan trend returns dates in correct format."""
        # Create a scan
        scan = Scan(
-            config_file='/app/configs/test.yaml',
+            config_id=1,
            timestamp=datetime.utcnow(),
            status='completed',
            duration=10.5

@@ -1,197 +0,0 @@
#!/usr/bin/env python3
"""
Phase 1 validation script.

Validates that all Phase 1 deliverables are in place and code structure is correct.
Does not require dependencies to be installed.
"""

import ast
import os
import sys
from pathlib import Path


def validate_file_exists(file_path, description):
    """Check if a file exists."""
    if Path(file_path).exists():
        print(f"✓ {description}: {file_path}")
        return True
    else:
        print(f"✗ {description} missing: {file_path}")
        return False


def validate_directory_exists(dir_path, description):
    """Check if a directory exists."""
    if Path(dir_path).is_dir():
        print(f"✓ {description}: {dir_path}")
        return True
    else:
        print(f"✗ {description} missing: {dir_path}")
        return False


def validate_python_syntax(file_path):
    """Validate Python file syntax."""
    try:
        with open(file_path, 'r') as f:
            ast.parse(f.read())
        return True
    except SyntaxError as e:
        print(f"  ✗ Syntax error in {file_path}: {e}")
        return False


def main():
    """Run all validation checks."""
    print("=" * 70)
    print("SneakyScanner Phase 1 Validation")
    print("=" * 70)

    all_passed = True

    # Check project structure
    print("\n1. Project Structure:")
    print("-" * 70)

    structure_checks = [
        ("web/", "Web application directory"),
        ("web/api/", "API blueprints directory"),
        ("web/templates/", "Jinja2 templates directory"),
        ("web/static/", "Static files directory"),
        ("web/utils/", "Utility modules directory"),
        ("migrations/", "Alembic migrations directory"),
        ("migrations/versions/", "Migration versions directory"),
    ]

    for path, desc in structure_checks:
        if not validate_directory_exists(path, desc):
            all_passed = False

    # Check core files
    print("\n2. Core Files:")
    print("-" * 70)

    core_files = [
        ("requirements-web.txt", "Web dependencies"),
        ("alembic.ini", "Alembic configuration"),
        ("init_db.py", "Database initialization script"),
        ("docker-compose-web.yml", "Docker Compose for web app"),
    ]

    for path, desc in core_files:
        if not validate_file_exists(path, desc):
            all_passed = False

    # Check Python modules
    print("\n3. Python Modules:")
    print("-" * 70)

    python_modules = [
        ("web/__init__.py", "Web package init"),
        ("web/models.py", "SQLAlchemy models"),
        ("web/app.py", "Flask application factory"),
        ("web/utils/__init__.py", "Utils package init"),
        ("web/utils/settings.py", "Settings manager"),
        ("web/api/__init__.py", "API package init"),
        ("web/api/scans.py", "Scans API blueprint"),
        ("web/api/schedules.py", "Schedules API blueprint"),
        ("web/api/alerts.py", "Alerts API blueprint"),
        ("web/api/settings.py", "Settings API blueprint"),
        ("migrations/env.py", "Alembic environment"),
        ("migrations/script.py.mako", "Migration template"),
        ("migrations/versions/001_initial_schema.py", "Initial migration"),
    ]

    for path, desc in python_modules:
        exists = validate_file_exists(path, desc)
        if exists:
            # Skip syntax check for .mako templates (they're not pure Python)
            if not path.endswith('.mako'):
                if not validate_python_syntax(path):
                    all_passed = False
            else:
                print(f"  (Skipped syntax check for template file)")
        else:
            all_passed = False

    # Check models
    print("\n4. Database Models (from models.py):")
    print("-" * 70)

    try:
        # Read models.py and look for class definitions
        with open('web/models.py', 'r') as f:
            content = f.read()
        tree = ast.parse(content)

        models = []
        for node in ast.walk(tree):
            if isinstance(node, ast.ClassDef) and node.name != 'Base':
                models.append(node.name)

        expected_models = [
            'Scan', 'ScanSite', 'ScanIP', 'ScanPort', 'ScanService',
            'ScanCertificate', 'ScanTLSVersion', 'Schedule', 'Alert',
            'AlertRule', 'Setting'
        ]

        for model in expected_models:
            if model in models:
                print(f"✓ Model defined: {model}")
            else:
                print(f"✗ Model missing: {model}")
                all_passed = False

    except Exception as e:
        print(f"✗ Failed to parse models.py: {e}")
        all_passed = False

    # Check API endpoints
    print("\n5. API Blueprints:")
    print("-" * 70)

    blueprints = {
        'web/api/scans.py': ['list_scans', 'get_scan', 'trigger_scan', 'delete_scan'],
        'web/api/schedules.py': ['list_schedules', 'get_schedule', 'create_schedule'],
        'web/api/alerts.py': ['list_alerts', 'list_alert_rules'],
        'web/api/settings.py': ['get_settings', 'update_settings'],
    }

    for blueprint_file, expected_funcs in blueprints.items():
        try:
            with open(blueprint_file, 'r') as f:
                content = f.read()
            tree = ast.parse(content)

            functions = [node.name for node in ast.walk(tree) if isinstance(node, ast.FunctionDef)]

            print(f"\n  {blueprint_file}:")
            for func in expected_funcs:
                if func in functions:
                    print(f"    ✓ Endpoint: {func}")
                else:
                    print(f"    ✗ Missing endpoint: {func}")
                    all_passed = False
        except Exception as e:
            print(f"  ✗ Failed to parse {blueprint_file}: {e}")
            all_passed = False

    # Summary
    print("\n" + "=" * 70)
    if all_passed:
        print("✓ All Phase 1 validation checks passed!")
        print("\nNext steps:")
        print("1. Install dependencies: pip install -r requirements-web.txt")
        print("2. Initialize database: python3 init_db.py --password YOUR_PASSWORD")
        print("3. Run Flask app: python3 -m web.app")
        print("4. Test API: curl http://localhost:5000/api/settings/health")
        return 0
    else:
        print("✗ Some validation checks failed. Please review errors above.")
        return 1


if __name__ == '__main__':
    sys.exit(main())

@@ -4,9 +4,13 @@ Alerts API blueprint.
|
||||
Handles endpoints for viewing alert history and managing alert rules.
|
||||
"""
|
||||
|
||||
from flask import Blueprint, jsonify, request
|
||||
import json
|
||||
from datetime import datetime, timedelta, timezone
|
||||
from flask import Blueprint, jsonify, request, current_app
|
||||
|
||||
from web.auth.decorators import api_auth_required
|
||||
from web.models import Alert, AlertRule, Scan
|
||||
from web.services.alert_service import AlertService
|
||||
|
||||
bp = Blueprint('alerts', __name__)
|
||||
|
||||
@@ -22,22 +26,167 @@ def list_alerts():
|
||||
per_page: Items per page (default: 20)
|
||||
alert_type: Filter by alert type
|
||||
severity: Filter by severity (info, warning, critical)
|
||||
start_date: Filter alerts after this date
|
||||
end_date: Filter alerts before this date
|
||||
acknowledged: Filter by acknowledgment status (true/false)
|
||||
scan_id: Filter by specific scan
|
||||
start_date: Filter alerts after this date (ISO format)
|
||||
end_date: Filter alerts before this date (ISO format)
|
||||
|
||||
Returns:
|
||||
JSON response with alerts list
|
||||
"""
|
||||
# TODO: Implement in Phase 4
|
||||
# Get query parameters
|
||||
page = request.args.get('page', 1, type=int)
|
||||
per_page = min(request.args.get('per_page', 20, type=int), 100) # Max 100 items
|
||||
alert_type = request.args.get('alert_type')
|
||||
severity = request.args.get('severity')
|
||||
acknowledged = request.args.get('acknowledged')
|
||||
scan_id = request.args.get('scan_id', type=int)
|
||||
start_date = request.args.get('start_date')
|
||||
end_date = request.args.get('end_date')
|
||||
|
||||
# Build query
|
||||
query = current_app.db_session.query(Alert)
|
||||
|
||||
# Apply filters
|
||||
if alert_type:
|
||||
query = query.filter(Alert.alert_type == alert_type)
|
||||
if severity:
|
||||
query = query.filter(Alert.severity == severity)
|
||||
if acknowledged is not None:
|
||||
ack_bool = acknowledged.lower() == 'true'
|
||||
query = query.filter(Alert.acknowledged == ack_bool)
|
||||
if scan_id:
|
||||
query = query.filter(Alert.scan_id == scan_id)
|
||||
if start_date:
|
||||
try:
|
||||
start_dt = datetime.fromisoformat(start_date.replace('Z', '+00:00'))
|
||||
query = query.filter(Alert.created_at >= start_dt)
|
||||
except ValueError:
|
||||
pass # Ignore invalid date format
|
||||
if end_date:
|
||||
try:
|
||||
end_dt = datetime.fromisoformat(end_date.replace('Z', '+00:00'))
|
||||
query = query.filter(Alert.created_at <= end_dt)
|
||||
except ValueError:
|
||||
pass # Ignore invalid date format
|
||||
|
||||
# Order by severity and date
|
||||
query = query.order_by(
|
||||
Alert.severity.desc(), # Critical first, then warning, then info
|
||||
Alert.created_at.desc() # Most recent first
|
||||
)
|
||||
|
||||
# Paginate
|
||||
total = query.count()
|
||||
alerts = query.offset((page - 1) * per_page).limit(per_page).all()
|
||||
|
||||
# Format response
|
||||
alerts_data = []
|
||||
for alert in alerts:
|
||||
# Get scan info
|
||||
scan = current_app.db_session.query(Scan).filter(Scan.id == alert.scan_id).first()
|
||||
|
||||
alerts_data.append({
|
||||
'id': alert.id,
|
||||
'scan_id': alert.scan_id,
|
||||
'scan_title': scan.title if scan else None,
|
||||
'rule_id': alert.rule_id,
|
||||
'alert_type': alert.alert_type,
|
||||
'severity': alert.severity,
|
||||
'message': alert.message,
|
||||
'ip_address': alert.ip_address,
|
||||
'port': alert.port,
|
||||
'acknowledged': alert.acknowledged,
|
||||
'acknowledged_at': alert.acknowledged_at.isoformat() if alert.acknowledged_at else None,
|
||||
'acknowledged_by': alert.acknowledged_by,
|
||||
'email_sent': alert.email_sent,
|
||||
'email_sent_at': alert.email_sent_at.isoformat() if alert.email_sent_at else None,
|
||||
'webhook_sent': alert.webhook_sent,
|
||||
'webhook_sent_at': alert.webhook_sent_at.isoformat() if alert.webhook_sent_at else None,
|
||||
'created_at': alert.created_at.isoformat()
|
||||
})
|
||||
|
||||
return jsonify({
|
||||
'alerts': [],
|
||||
'total': 0,
|
||||
'page': 1,
|
||||
'per_page': 20,
|
||||
'message': 'Alerts list endpoint - to be implemented in Phase 4'
|
||||
'alerts': alerts_data,
|
||||
'total': total,
|
||||
'page': page,
|
||||
'per_page': per_page,
|
||||
'pages': (total + per_page - 1) // per_page # Ceiling division
|
||||
})
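
The implemented list endpoint can be exercised from any HTTP client; a sketch using `requests` (base URL from the validation script above; authentication handling is omitted and depends on your deployment):

```python
import requests

BASE = 'http://localhost:5000/api'

resp = requests.get(f'{BASE}/alerts', params={
    'severity': 'critical',          # only critical alerts
    'acknowledged': 'false',         # not yet acknowledged
    'start_date': '2025-11-01T00:00:00Z',
    'page': 1,
    'per_page': 50,
})
data = resp.json()
print(f"{data['total']} alerts across {data['pages']} pages")
for alert in data['alerts']:
    print(alert['severity'], alert['ip_address'], alert['message'])
```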


@bp.route('/<int:alert_id>/acknowledge', methods=['POST'])
@api_auth_required
def acknowledge_alert(alert_id):
    """
    Acknowledge an alert.

    Args:
        alert_id: Alert ID to acknowledge

    Returns:
        JSON response with acknowledgment status
    """
    # Get username from auth context or default to 'api'
    acknowledged_by = request.json.get('acknowledged_by', 'api') if request.json else 'api'

    alert_service = AlertService(current_app.db_session)
    success = alert_service.acknowledge_alert(alert_id, acknowledged_by)

    if success:
        return jsonify({
            'status': 'success',
            'message': f'Alert {alert_id} acknowledged',
            'acknowledged_by': acknowledged_by
        })
    else:
        return jsonify({
            'status': 'error',
            'message': f'Failed to acknowledge alert {alert_id}'
        }), 400


@bp.route('/acknowledge-all', methods=['POST'])
@api_auth_required
def acknowledge_all_alerts():
    """
    Acknowledge all unacknowledged alerts.

    Returns:
        JSON response with count of acknowledged alerts
    """
    acknowledged_by = request.json.get('acknowledged_by', 'api') if request.json else 'api'

    try:
        # Get all unacknowledged alerts
        unacked_alerts = current_app.db_session.query(Alert).filter(
            Alert.acknowledged == False
        ).all()

        count = 0
        for alert in unacked_alerts:
            alert.acknowledged = True
            alert.acknowledged_at = datetime.now(timezone.utc)
            alert.acknowledged_by = acknowledged_by
            count += 1

        current_app.db_session.commit()

        return jsonify({
            'status': 'success',
            'message': f'Acknowledged {count} alerts',
            'count': count,
            'acknowledged_by': acknowledged_by
        })

    except Exception as e:
        current_app.db_session.rollback()
        return jsonify({
            'status': 'error',
            'message': f'Failed to acknowledge alerts: {str(e)}'
        }), 500
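
A matching client-side sketch for the two acknowledgment endpoints (alert ID and operator name are illustrative):

```python
import requests

BASE = 'http://localhost:5000/api'

# Acknowledge a single alert, recording who did it
requests.post(f'{BASE}/alerts/42/acknowledge',
              json={'acknowledged_by': 'jdoe'})

# Bulk-acknowledge everything still open
resp = requests.post(f'{BASE}/alerts/acknowledge-all',
                     json={'acknowledged_by': 'jdoe'})
print(resp.json()['count'], 'alerts acknowledged')
```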


@bp.route('/rules', methods=['GET'])
@api_auth_required
def list_alert_rules():
@@ -47,10 +196,29 @@ def list_alert_rules():
    Returns:
        JSON response with alert rules
    """
    # TODO: Implement in Phase 4
    rules = current_app.db_session.query(AlertRule).order_by(AlertRule.name, AlertRule.rule_type).all()

    rules_data = []
    for rule in rules:
        rules_data.append({
            'id': rule.id,
            'name': rule.name,
            'rule_type': rule.rule_type,
            'enabled': rule.enabled,
            'threshold': rule.threshold,
            'email_enabled': rule.email_enabled,
            'webhook_enabled': rule.webhook_enabled,
            'severity': rule.severity,
            'filter_conditions': json.loads(rule.filter_conditions) if rule.filter_conditions else None,
            'config_id': rule.config_id,
            'config_title': rule.config.title if rule.config else None,
            'created_at': rule.created_at.isoformat(),
            'updated_at': rule.updated_at.isoformat() if rule.updated_at else None
        })

    return jsonify({
        'rules': [],
        'message': 'Alert rules list endpoint - to be implemented in Phase 4'
        'rules': rules_data,
        'total': len(rules_data)
    })


@@ -61,23 +229,100 @@ def create_alert_rule():
    Create a new alert rule.

    Request body:
        rule_type: Type of alert rule
        threshold: Threshold value (e.g., days for cert expiry)
        name: User-friendly rule name
        rule_type: Type of alert rule (unexpected_port, drift_detection, cert_expiry, weak_tls, ping_failed)
        threshold: Threshold value (e.g., days for cert expiry, percentage for drift)
        enabled: Whether rule is active (default: true)
        email_enabled: Send email for this rule (default: false)
        webhook_enabled: Send webhook for this rule (default: false)
        severity: Alert severity (critical, warning, info)
        filter_conditions: JSON object with filter conditions
        config_id: Optional config ID to apply rule to

    Returns:
        JSON response with created rule ID
        JSON response with created rule
    """
    # TODO: Implement in Phase 4
    data = request.get_json() or {}

    return jsonify({
        'rule_id': None,
        'status': 'not_implemented',
        'message': 'Alert rule creation endpoint - to be implemented in Phase 4',
        'data': data
    }), 501
    # Validate required fields
    if not data.get('rule_type'):
        return jsonify({
            'status': 'error',
            'message': 'rule_type is required'
        }), 400

    # Valid rule types
    valid_rule_types = ['unexpected_port', 'drift_detection', 'cert_expiry', 'weak_tls', 'ping_failed']
    if data['rule_type'] not in valid_rule_types:
        return jsonify({
            'status': 'error',
            'message': f'Invalid rule_type. Must be one of: {", ".join(valid_rule_types)}'
        }), 400

    # Valid severities
    valid_severities = ['critical', 'warning', 'info']
    if data.get('severity') and data['severity'] not in valid_severities:
        return jsonify({
            'status': 'error',
            'message': f'Invalid severity. Must be one of: {", ".join(valid_severities)}'
        }), 400

    try:
        # Validate config_id if provided
        config_id = data.get('config_id')
        if config_id:
            from web.models import ScanConfig
            config = current_app.db_session.query(ScanConfig).filter_by(id=config_id).first()
            if not config:
                return jsonify({
                    'status': 'error',
                    'message': f'Config with ID {config_id} not found'
                }), 400

        # Create new rule
        rule = AlertRule(
            name=data.get('name', f"{data['rule_type']} rule"),
            rule_type=data['rule_type'],
            enabled=data.get('enabled', True),
            threshold=data.get('threshold'),
            email_enabled=data.get('email_enabled', False),
            webhook_enabled=data.get('webhook_enabled', False),
            severity=data.get('severity', 'warning'),
            filter_conditions=json.dumps(data['filter_conditions']) if data.get('filter_conditions') else None,
            config_id=config_id,
            created_at=datetime.now(timezone.utc),
            updated_at=datetime.now(timezone.utc)
        )

        current_app.db_session.add(rule)
        current_app.db_session.commit()

        return jsonify({
            'status': 'success',
            'message': 'Alert rule created successfully',
            'rule': {
                'id': rule.id,
                'name': rule.name,
                'rule_type': rule.rule_type,
                'enabled': rule.enabled,
                'threshold': rule.threshold,
                'email_enabled': rule.email_enabled,
                'webhook_enabled': rule.webhook_enabled,
                'severity': rule.severity,
                'filter_conditions': json.loads(rule.filter_conditions) if rule.filter_conditions else None,
                'config_id': rule.config_id,
                'config_title': rule.config.title if rule.config else None,
                'created_at': rule.created_at.isoformat(),
                'updated_at': rule.updated_at.isoformat()
            }
        }), 201

    except Exception as e:
        current_app.db_session.rollback()
        return jsonify({
            'status': 'error',
            'message': f'Failed to create alert rule: {str(e)}'
        }), 500
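
A representative payload for the rule-creation endpoint, under the assumption that the alerts blueprint is mounted at /api/alerts (values are illustrative):

```python
import requests

BASE = 'http://localhost:5000/api'

# Warn when a certificate is within 30 days of expiry, scoped to config 1
resp = requests.post(f'{BASE}/alerts/rules', json={
    'name': 'Cert expiry warning',
    'rule_type': 'cert_expiry',
    'threshold': 30,
    'severity': 'warning',
    'email_enabled': True,
    'config_id': 1,
})
print(resp.status_code)           # 201 on success
print(resp.json()['rule']['id'])
```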


@bp.route('/rules/<int:rule_id>', methods=['PUT'])
@@ -90,22 +335,97 @@ def update_alert_rule(rule_id):
        rule_id: Alert rule ID to update

    Request body:
        name: User-friendly rule name (optional)
        threshold: Threshold value (optional)
        enabled: Whether rule is active (optional)
        email_enabled: Send email for this rule (optional)
        webhook_enabled: Send webhook for this rule (optional)
        severity: Alert severity (optional)
        filter_conditions: JSON object with filter conditions (optional)
        config_id: Config ID to apply rule to (optional)

    Returns:
        JSON response with update status
    """
    # TODO: Implement in Phase 4
    data = request.get_json() or {}

    return jsonify({
        'rule_id': rule_id,
        'status': 'not_implemented',
        'message': 'Alert rule update endpoint - to be implemented in Phase 4',
        'data': data
    }), 501
    # Get existing rule
    rule = current_app.db_session.query(AlertRule).filter(AlertRule.id == rule_id).first()
    if not rule:
        return jsonify({
            'status': 'error',
            'message': f'Alert rule {rule_id} not found'
        }), 404

    # Valid severities
    valid_severities = ['critical', 'warning', 'info']
    if data.get('severity') and data['severity'] not in valid_severities:
        return jsonify({
            'status': 'error',
            'message': f'Invalid severity. Must be one of: {", ".join(valid_severities)}'
        }), 400

    try:
        # Validate config_id if provided
        if 'config_id' in data:
            config_id = data['config_id']
            if config_id:
                from web.models import ScanConfig
                config = current_app.db_session.query(ScanConfig).filter_by(id=config_id).first()
                if not config:
                    return jsonify({
                        'status': 'error',
                        'message': f'Config with ID {config_id} not found'
                    }), 400

        # Update fields if provided
        if 'name' in data:
            rule.name = data['name']
        if 'threshold' in data:
            rule.threshold = data['threshold']
        if 'enabled' in data:
            rule.enabled = data['enabled']
        if 'email_enabled' in data:
            rule.email_enabled = data['email_enabled']
        if 'webhook_enabled' in data:
            rule.webhook_enabled = data['webhook_enabled']
        if 'severity' in data:
            rule.severity = data['severity']
        if 'filter_conditions' in data:
            rule.filter_conditions = json.dumps(data['filter_conditions']) if data['filter_conditions'] else None
        if 'config_id' in data:
            rule.config_id = data['config_id']

        rule.updated_at = datetime.now(timezone.utc)

        current_app.db_session.commit()

        return jsonify({
            'status': 'success',
            'message': 'Alert rule updated successfully',
            'rule': {
                'id': rule.id,
                'name': rule.name,
                'rule_type': rule.rule_type,
                'enabled': rule.enabled,
                'threshold': rule.threshold,
                'email_enabled': rule.email_enabled,
                'webhook_enabled': rule.webhook_enabled,
                'severity': rule.severity,
                'filter_conditions': json.loads(rule.filter_conditions) if rule.filter_conditions else None,
                'config_id': rule.config_id,
                'config_title': rule.config.title if rule.config else None,
                'created_at': rule.created_at.isoformat(),
                'updated_at': rule.updated_at.isoformat()
            }
        })

    except Exception as e:
        current_app.db_session.rollback()
        return jsonify({
            'status': 'error',
            'message': f'Failed to update alert rule: {str(e)}'
        }), 500


@bp.route('/rules/<int:rule_id>', methods=['DELETE'])
@@ -120,12 +440,83 @@ def delete_alert_rule(rule_id):
    Returns:
        JSON response with deletion status
    """
    # TODO: Implement in Phase 4
    # Get existing rule
    rule = current_app.db_session.query(AlertRule).filter(AlertRule.id == rule_id).first()
    if not rule:
        return jsonify({
            'status': 'error',
            'message': f'Alert rule {rule_id} not found'
        }), 404

    try:
        # Delete the rule (cascade will delete related alerts)
        current_app.db_session.delete(rule)
        current_app.db_session.commit()

        return jsonify({
            'status': 'success',
            'message': f'Alert rule {rule_id} deleted successfully'
        })

    except Exception as e:
        current_app.db_session.rollback()
        return jsonify({
            'status': 'error',
            'message': f'Failed to delete alert rule: {str(e)}'
        }), 500


@bp.route('/stats', methods=['GET'])
@api_auth_required
def alert_stats():
    """
    Get alert statistics.

    Query params:
        days: Number of days to look back (default: 7)

    Returns:
        JSON response with alert statistics
    """
    days = request.args.get('days', 7, type=int)
    cutoff_date = datetime.now(timezone.utc) - timedelta(days=days)

    # Get alerts in date range
    alerts = current_app.db_session.query(Alert).filter(Alert.created_at >= cutoff_date).all()

    # Calculate statistics
    total_alerts = len(alerts)
    alerts_by_severity = {'critical': 0, 'warning': 0, 'info': 0}
    alerts_by_type = {}
    unacknowledged_count = 0

    for alert in alerts:
        # Count by severity
        if alert.severity in alerts_by_severity:
            alerts_by_severity[alert.severity] += 1

        # Count by type
        if alert.alert_type not in alerts_by_type:
            alerts_by_type[alert.alert_type] = 0
        alerts_by_type[alert.alert_type] += 1

        # Count unacknowledged
        if not alert.acknowledged:
            unacknowledged_count += 1

    return jsonify({
        'rule_id': rule_id,
        'status': 'not_implemented',
        'message': 'Alert rule deletion endpoint - to be implemented in Phase 4'
    }), 501
        'stats': {
            'total_alerts': total_alerts,
            'unacknowledged_count': unacknowledged_count,
            'alerts_by_severity': alerts_by_severity,
            'alerts_by_type': alerts_by_type,
            'date_range': {
                'start': cutoff_date.isoformat(),
                'end': datetime.now(timezone.utc).isoformat(),
                'days': days
            }
        }
    })
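
The statistics endpoint aggregates entirely in Python over the filtered rows; a quick client-side check (URL assumed from the local deployment above):

```python
import requests

resp = requests.get('http://localhost:5000/api/alerts/stats', params={'days': 30})
stats = resp.json()['stats']
print('open alerts:', stats['unacknowledged_count'])
print('critical in window:', stats['alerts_by_severity']['critical'])
```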


# Health check endpoint
@@ -140,5 +531,5 @@ def health_check():
    return jsonify({
        'status': 'healthy',
        'api': 'alerts',
        'version': '1.0.0-phase1'
    })
        'version': '1.0.0-phase5'
    })
@@ -1,14 +1,12 @@
"""
Configs API blueprint.

Handles endpoints for managing scan configuration files, including CSV/YAML upload,
template download, and config management.
Handles endpoints for managing scan configurations stored in the database.
Provides REST API for creating, updating, and deleting configs that reference sites.
"""

import logging
import io
from flask import Blueprint, jsonify, request, send_file
from werkzeug.utils import secure_filename
from flask import Blueprint, jsonify, request, current_app

from web.auth.decorators import api_auth_required
from web.services.config_service import ConfigService
@@ -17,32 +15,40 @@ bp = Blueprint('configs', __name__)
logger = logging.getLogger(__name__)


# ============================================================================
# Database-based Config Endpoints (Primary)
# ============================================================================

@bp.route('', methods=['GET'])
@api_auth_required
def list_configs():
    """
    List all config files with metadata.
    List all scan configurations from database.

    Returns:
        JSON response with list of configs:
        {
            "configs": [
                {
                    "filename": "prod-scan.yaml",
                    "title": "Prod Scan",
                    "path": "/app/configs/prod-scan.yaml",
                    "created_at": "2025-11-15T10:30:00Z",
                    "size_bytes": 1234,
                    "used_by_schedules": ["Daily Scan"]
                    "id": 1,
                    "title": "Production Scan",
                    "description": "Weekly production scan",
                    "site_count": 3,
                    "sites": [
                        {"id": 1, "name": "Production DC"},
                        {"id": 2, "name": "DMZ"}
                    ],
                    "created_at": "2025-11-19T10:30:00Z",
                    "updated_at": "2025-11-19T10:30:00Z"
                }
            ]
        }
    """
    try:
        config_service = ConfigService()
        configs = config_service.list_configs()
        config_service = ConfigService(db_session=current_app.db_session)
        configs = config_service.list_configs_db()

        logger.info(f"Listed {len(configs)} config files")
        logger.info(f"Listed {len(configs)} configs from database")

        return jsonify({
            'configs': configs
@@ -56,78 +62,38 @@ def list_configs():
        }), 500


@bp.route('/<filename>', methods=['GET'])
@bp.route('', methods=['POST'])
@api_auth_required
def get_config(filename: str):
def create_config():
    """
    Get config file content and parsed data.

    Args:
        filename: Config filename

    Returns:
        JSON response with config content:
        {
            "filename": "prod-scan.yaml",
            "content": "title: Prod Scan\n...",
            "parsed": {"title": "Prod Scan", "sites": [...]}
        }
    """
    try:
        # Sanitize filename
        filename = secure_filename(filename)

        config_service = ConfigService()
        config_data = config_service.get_config(filename)

        logger.info(f"Retrieved config file: {filename}")

        return jsonify(config_data)

    except FileNotFoundError as e:
        logger.warning(f"Config file not found: {filename}")
        return jsonify({
            'error': 'Not found',
            'message': str(e)
        }), 404

    except ValueError as e:
        logger.warning(f"Invalid config file: {filename} - {str(e)}")
        return jsonify({
            'error': 'Invalid config',
            'message': str(e)
        }), 400

    except Exception as e:
        logger.error(f"Unexpected error getting config {filename}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/create-from-cidr', methods=['POST'])
@api_auth_required
def create_from_cidr():
    """
    Create config from CIDR range.
    Create a new scan configuration in the database.

    Request:
        JSON with:
        {
            "title": "My Scan",
            "cidr": "10.0.0.0/24",
            "site_name": "Production" (optional),
            "ping_default": false (optional)
            "title": "Production Scan",
            "description": "Weekly production scan (optional)",
            "site_ids": [1, 2, 3]
        }

    Returns:
        JSON response with created config info:
        JSON response with created config:
        {
            "success": true,
            "filename": "my-scan.yaml",
            "preview": "title: My Scan\n..."
            "config": {
                "id": 1,
                "title": "Production Scan",
                "description": "...",
                "site_count": 3,
                "sites": [...],
                "created_at": "2025-11-19T10:30:00Z",
                "updated_at": "2025-11-19T10:30:00Z"
            }
        }

    Error responses:
        - 400: Validation error or missing fields
        - 500: Internal server error
    """
    try:
        data = request.get_json()
@@ -145,272 +111,192 @@ def create_from_cidr():
                'message': 'Missing required field: title'
            }), 400

        if 'cidr' not in data:
        if 'site_ids' not in data:
            return jsonify({
                'error': 'Bad request',
                'message': 'Missing required field: cidr'
                'message': 'Missing required field: site_ids'
            }), 400

        title = data['title']
        cidr = data['cidr']
        site_name = data.get('site_name', None)
        ping_default = data.get('ping_default', False)
        description = data.get('description', None)
        site_ids = data['site_ids']

        # Validate title
        if not title or not title.strip():
        if not isinstance(site_ids, list):
            return jsonify({
                'error': 'Validation error',
                'message': 'Title cannot be empty'
                'error': 'Bad request',
                'message': 'Field site_ids must be an array'
            }), 400

        # Create config from CIDR
        config_service = ConfigService()
        filename, yaml_preview = config_service.create_from_cidr(
            title=title,
            cidr=cidr,
            site_name=site_name,
            ping_default=ping_default
        )
        # Create config
        config_service = ConfigService(db_session=current_app.db_session)
        config = config_service.create_config(title, description, site_ids)

        logger.info(f"Created config from CIDR {cidr}: {filename}")
        logger.info(f"Created config: {config['title']} (ID: {config['id']})")

        return jsonify({
            'success': True,
            'filename': filename,
            'preview': yaml_preview
        })
            'config': config
        }), 201

    except ValueError as e:
        logger.warning(f"CIDR validation failed: {str(e)}")
        logger.warning(f"Config validation failed: {str(e)}")
        return jsonify({
            'error': 'Validation error',
            'message': str(e)
        }), 400

    except Exception as e:
        logger.error(f"Unexpected error creating config from CIDR: {str(e)}", exc_info=True)
        logger.error(f"Unexpected error creating config: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500
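
With configs now stored in the database, creating one is a single JSON POST; a sketch (IDs are illustrative, and the referenced sites must already exist):

```python
import requests

BASE = 'http://localhost:5000/api'

resp = requests.post(f'{BASE}/configs', json={
    'title': 'Production Scan',
    'description': 'Weekly production scan',
    'site_ids': [1, 2],
})
assert resp.status_code == 201
config = resp.json()['config']
print(config['id'], config['site_count'])
```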


@bp.route('/upload-yaml', methods=['POST'])
@bp.route('/<int:config_id>', methods=['GET'])
@api_auth_required
def upload_yaml():
def get_config(config_id: int):
    """
    Upload YAML config file directly.

    Request:
        multipart/form-data with 'file' field containing YAML file
        Optional 'filename' field for custom filename

    Returns:
        JSON response with created config info:
        {
            "success": true,
            "filename": "prod-scan.yaml"
        }
    """
    try:
        # Check if file is present
        if 'file' not in request.files:
            return jsonify({
                'error': 'Bad request',
                'message': 'No file provided'
            }), 400

        file = request.files['file']

        # Check if file is selected
        if file.filename == '':
            return jsonify({
                'error': 'Bad request',
                'message': 'No file selected'
            }), 400

        # Check file extension
        if not (file.filename.endswith('.yaml') or file.filename.endswith('.yml')):
            return jsonify({
                'error': 'Bad request',
                'message': 'File must be a YAML file (.yaml or .yml extension)'
            }), 400

        # Read YAML content
        yaml_content = file.read().decode('utf-8')

        # Get filename (use uploaded filename or custom)
        filename = request.form.get('filename', file.filename)
        filename = secure_filename(filename)

        # Create config from YAML
        config_service = ConfigService()
        final_filename = config_service.create_from_yaml(filename, yaml_content)

        logger.info(f"Created config from YAML upload: {final_filename}")

        return jsonify({
            'success': True,
            'filename': final_filename
        })

    except ValueError as e:
        logger.warning(f"YAML validation failed: {str(e)}")
        return jsonify({
            'error': 'Validation error',
            'message': str(e)
        }), 400

    except UnicodeDecodeError:
        logger.warning("YAML file encoding error")
        return jsonify({
            'error': 'Encoding error',
            'message': 'YAML file must be UTF-8 encoded'
        }), 400

    except Exception as e:
        logger.error(f"Unexpected error uploading YAML: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<filename>/download', methods=['GET'])
@api_auth_required
def download_config(filename: str):
    """
    Download existing config file.
    Get a scan configuration by ID.

    Args:
        filename: Config filename
        config_id: Configuration ID

    Returns:
        YAML file download
    """
    try:
        # Sanitize filename
        filename = secure_filename(filename)

        config_service = ConfigService()
        config_data = config_service.get_config(filename)

        # Create file-like object
        yaml_file = io.BytesIO(config_data['content'].encode('utf-8'))
        yaml_file.seek(0)

        logger.info(f"Config file downloaded: {filename}")

        # Send file
        return send_file(
            yaml_file,
            mimetype='application/x-yaml',
            as_attachment=True,
            download_name=filename
        )

    except FileNotFoundError as e:
        logger.warning(f"Config file not found: {filename}")
        return jsonify({
            'error': 'Not found',
            'message': str(e)
        }), 404

    except Exception as e:
        logger.error(f"Unexpected error downloading config {filename}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<filename>', methods=['PUT'])
@api_auth_required
def update_config(filename: str):
    """
    Update existing config file with new YAML content.

    Args:
        filename: Config filename

    Request:
        JSON with:
        JSON response with config details:
        {
            "content": "title: My Scan\nsites: ..."
        }

    Returns:
        JSON response with success status:
        {
            "success": true,
            "message": "Config updated successfully"
            "id": 1,
            "title": "Production Scan",
            "description": "...",
            "site_count": 3,
            "sites": [
                {
                    "id": 1,
                    "name": "Production DC",
                    "description": "...",
                    "ip_count": 5
                }
            ],
            "created_at": "2025-11-19T10:30:00Z",
            "updated_at": "2025-11-19T10:30:00Z"
        }

    Error responses:
        - 400: Invalid YAML or config structure
        - 404: Config file not found
        - 404: Config not found
        - 500: Internal server error
    """
    try:
        # Sanitize filename
        filename = secure_filename(filename)
        config_service = ConfigService(db_session=current_app.db_session)
        config = config_service.get_config_by_id(config_id)

        data = request.get_json()
        logger.info(f"Retrieved config: {config['title']} (ID: {config_id})")

        if not data or 'content' not in data:
            return jsonify({
                'error': 'Bad request',
                'message': 'Missing required field: content'
            }), 400
        return jsonify(config)

        yaml_content = data['content']

        # Update config
        config_service = ConfigService()
        config_service.update_config(filename, yaml_content)

        logger.info(f"Updated config file: {filename}")

        return jsonify({
            'success': True,
            'message': 'Config updated successfully'
        })

    except FileNotFoundError as e:
        logger.warning(f"Config file not found: {filename}")
    except ValueError as e:
        logger.warning(f"Config not found: {config_id}")
        return jsonify({
            'error': 'Not found',
            'message': str(e)
        }), 404

    except ValueError as e:
        logger.warning(f"Invalid config content for {filename}: {str(e)}")
        return jsonify({
            'error': 'Validation error',
            'message': str(e)
        }), 400

    except Exception as e:
        logger.error(f"Unexpected error updating config {filename}: {str(e)}", exc_info=True)
        logger.error(f"Unexpected error getting config {config_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<filename>', methods=['DELETE'])
@bp.route('/<int:config_id>', methods=['PUT'])
@api_auth_required
def delete_config(filename: str):
def update_config(config_id: int):
    """
    Delete config file and cascade delete associated schedules.

    When a config is deleted, all schedules using that config (both enabled
    and disabled) are automatically deleted as well.
    Update an existing scan configuration.

    Args:
        filename: Config filename
        config_id: Configuration ID

    Request:
        JSON with (all fields optional):
        {
            "title": "New Title",
            "description": "New Description",
            "site_ids": [1, 2, 3]
        }

    Returns:
        JSON response with updated config:
        {
            "success": true,
            "config": {...}
        }

    Error responses:
        - 400: Validation error
        - 404: Config not found
        - 500: Internal server error
    """
    try:
        data = request.get_json()

        if not data:
            return jsonify({
                'error': 'Bad request',
                'message': 'Request body must be JSON'
            }), 400

        title = data.get('title', None)
        description = data.get('description', None)
        site_ids = data.get('site_ids', None)

        if site_ids is not None and not isinstance(site_ids, list):
            return jsonify({
                'error': 'Bad request',
                'message': 'Field site_ids must be an array'
            }), 400

        # Update config
        config_service = ConfigService(db_session=current_app.db_session)
        config = config_service.update_config(config_id, title, description, site_ids)

        logger.info(f"Updated config: {config['title']} (ID: {config_id})")

        return jsonify({
            'success': True,
            'config': config
        })

    except ValueError as e:
        if 'not found' in str(e).lower():
            logger.warning(f"Config not found: {config_id}")
            return jsonify({
                'error': 'Not found',
                'message': str(e)
            }), 404
        else:
            logger.warning(f"Config validation failed: {str(e)}")
            return jsonify({
                'error': 'Validation error',
                'message': str(e)
            }), 400

    except Exception as e:
        logger.error(f"Unexpected error updating config {config_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:config_id>', methods=['DELETE'])
@api_auth_required
def delete_config(config_id: int):
    """
    Delete a scan configuration.

    Args:
        config_id: Configuration ID

    Returns:
        JSON response with success status:
@@ -420,32 +306,155 @@ def delete_config(filename: str):
        }

    Error responses:
        - 404: Config file not found
        - 404: Config not found
        - 500: Internal server error
    """
    try:
        # Sanitize filename
        filename = secure_filename(filename)
        config_service = ConfigService(db_session=current_app.db_session)
        config_service.delete_config(config_id)

        config_service = ConfigService()
        config_service.delete_config(filename)

        logger.info(f"Deleted config file: {filename}")
        logger.info(f"Deleted config (ID: {config_id})")

        return jsonify({
            'success': True,
            'message': 'Config deleted successfully'
        })

    except FileNotFoundError as e:
        logger.warning(f"Config file not found: {filename}")
    except ValueError as e:
        logger.warning(f"Config not found: {config_id}")
        return jsonify({
            'error': 'Not found',
            'message': str(e)
        }), 404

    except Exception as e:
        logger.error(f"Unexpected error deleting config {filename}: {str(e)}", exc_info=True)
        logger.error(f"Unexpected error deleting config {config_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:config_id>/sites', methods=['POST'])
@api_auth_required
def add_site_to_config(config_id: int):
    """
    Add a site to an existing config.

    Args:
        config_id: Configuration ID

    Request:
        JSON with:
        {
            "site_id": 5
        }

    Returns:
        JSON response with updated config:
        {
            "success": true,
            "config": {...}
        }

    Error responses:
        - 400: Validation error or site already in config
        - 404: Config or site not found
        - 500: Internal server error
    """
    try:
        data = request.get_json()

        if not data or 'site_id' not in data:
            return jsonify({
                'error': 'Bad request',
                'message': 'Missing required field: site_id'
            }), 400

        site_id = data['site_id']

        # Add site to config
        config_service = ConfigService(db_session=current_app.db_session)
        config = config_service.add_site_to_config(config_id, site_id)

        logger.info(f"Added site {site_id} to config {config_id}")

        return jsonify({
            'success': True,
            'config': config
        })

    except ValueError as e:
        if 'not found' in str(e).lower():
            logger.warning(f"Config or site not found: {str(e)}")
            return jsonify({
                'error': 'Not found',
                'message': str(e)
            }), 404
        else:
            logger.warning(f"Validation error: {str(e)}")
            return jsonify({
                'error': 'Validation error',
                'message': str(e)
            }), 400

    except Exception as e:
        logger.error(f"Unexpected error adding site to config: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500
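
Attaching an existing site to a config is a one-field POST; a sketch (config and site IDs are illustrative):

```python
import requests

resp = requests.post('http://localhost:5000/api/configs/1/sites',
                     json={'site_id': 5})
if resp.ok:
    print('config now has', resp.json()['config']['site_count'], 'sites')
```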


@bp.route('/<int:config_id>/sites/<int:site_id>', methods=['DELETE'])
@api_auth_required
def remove_site_from_config(config_id: int, site_id: int):
    """
    Remove a site from a config.

    Args:
        config_id: Configuration ID
        site_id: Site ID to remove

    Returns:
        JSON response with updated config:
        {
            "success": true,
            "config": {...}
        }

    Error responses:
        - 400: Validation error (e.g., last site cannot be removed)
        - 404: Config not found or site not in config
        - 500: Internal server error
    """
    try:
        config_service = ConfigService(db_session=current_app.db_session)
        config = config_service.remove_site_from_config(config_id, site_id)

        logger.info(f"Removed site {site_id} from config {config_id}")

        return jsonify({
            'success': True,
            'config': config
        })

    except ValueError as e:
        if 'not found' in str(e).lower() or 'not in this config' in str(e).lower():
            logger.warning(f"Config or site not found: {str(e)}")
            return jsonify({
                'error': 'Not found',
                'message': str(e)
            }), 404
        else:
            logger.warning(f"Validation error: {str(e)}")
            return jsonify({
                'error': 'Validation error',
                'message': str(e)
            }), 400

    except Exception as e:
        logger.error(f"Unexpected error removing site from config: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'

@@ -5,19 +5,107 @@ Handles endpoints for triggering scans, listing scan history, and retrieving
scan results.
"""

import json
import logging
from datetime import datetime
from pathlib import Path

from flask import Blueprint, current_app, jsonify, request
from sqlalchemy.exc import SQLAlchemyError

from web.auth.decorators import api_auth_required
from web.models import Scan, ScanProgress
from web.services.scan_service import ScanService
from web.utils.validators import validate_config_file
from web.utils.pagination import validate_page_params
from web.jobs.scan_job import stop_scan

bp = Blueprint('scans', __name__)
logger = logging.getLogger(__name__)


def _recover_orphaned_scan(scan: Scan, session) -> dict:
    """
    Recover an orphaned scan by checking for output files.

    If output files exist: mark as 'completed' (smart recovery)
    If no output files: mark as 'cancelled'

    Args:
        scan: The orphaned Scan object
        session: Database session

    Returns:
        Dictionary with recovery result for API response
    """
    # Check for existing output files
    output_exists = False
    output_files_found = []

    # Check paths stored in database
    if scan.json_path and Path(scan.json_path).exists():
        output_exists = True
        output_files_found.append('json')
    if scan.html_path and Path(scan.html_path).exists():
        output_files_found.append('html')
    if scan.zip_path and Path(scan.zip_path).exists():
        output_files_found.append('zip')

    # Also check by timestamp pattern if paths not stored yet
    if not output_exists and scan.started_at:
        output_dir = Path('/app/output')
        if output_dir.exists():
            timestamp_pattern = scan.started_at.strftime('%Y%m%d')
            for json_file in output_dir.glob(f'scan_report_{timestamp_pattern}*.json'):
                output_exists = True
                output_files_found.append('json')
                # Update scan record with found paths
                scan.json_path = str(json_file)
                html_file = json_file.with_suffix('.html')
                if html_file.exists():
                    scan.html_path = str(html_file)
                    output_files_found.append('html')
                zip_file = json_file.with_suffix('.zip')
                if zip_file.exists():
                    scan.zip_path = str(zip_file)
                    output_files_found.append('zip')
                break

    if output_exists:
        # Smart recovery: outputs exist, mark as completed
        scan.status = 'completed'
        scan.completed_at = datetime.utcnow()
        if scan.started_at:
            scan.duration = (datetime.utcnow() - scan.started_at).total_seconds()
        scan.error_message = None
        session.commit()

        logger.info(f"Scan {scan.id}: Recovered as completed (files: {output_files_found})")

        return {
            'scan_id': scan.id,
            'status': 'completed',
            'message': f'Scan recovered as completed (output files found: {", ".join(output_files_found)})',
            'recovery_type': 'smart_recovery'
        }
    else:
        # No outputs: mark as cancelled
        scan.status = 'cancelled'
        scan.completed_at = datetime.utcnow()
        if scan.started_at:
            scan.duration = (datetime.utcnow() - scan.started_at).total_seconds()
        scan.error_message = 'Scan process was interrupted before completion. No output files were generated.'
        session.commit()

        logger.info(f"Scan {scan.id}: Marked as cancelled (orphaned, no output files)")

        return {
            'scan_id': scan.id,
            'status': 'cancelled',
            'message': 'Orphaned scan cancelled (no output files found)',
            'recovery_type': 'orphan_cleanup'
        }


@bp.route('', methods=['GET'])
@api_auth_required
def list_scans():
@@ -129,7 +217,7 @@ def trigger_scan():
    Trigger a new scan.

    Request body:
        config_file: Path to YAML config file
        config_id: Database config ID (required)

    Returns:
        JSON response with scan_id and status
@@ -137,25 +225,35 @@ def trigger_scan():
    try:
        # Get request data
        data = request.get_json() or {}
        config_file = data.get('config_file')
        config_id = data.get('config_id')

        # Validate required fields
        if not config_file:
            logger.warning("Scan trigger request missing config_file")
        if not config_id:
            logger.warning("Scan trigger request missing config_id")
            return jsonify({
                'error': 'Invalid request',
                'message': 'config_file is required'
                'message': 'config_id is required'
            }), 400

        # Validate config_id is an integer
        try:
            config_id = int(config_id)
        except (TypeError, ValueError):
            logger.warning(f"Invalid config_id type: {config_id}")
            return jsonify({
                'error': 'Invalid request',
                'message': 'config_id must be an integer'
            }), 400

        # Trigger scan via service
        scan_service = ScanService(current_app.db_session)
        scan_id = scan_service.trigger_scan(
            config_file=config_file,
            config_id=config_id,
            triggered_by='api',
            scheduler=current_app.scheduler
        )

        logger.info(f"Scan {scan_id} triggered via API: config={config_file}")
        logger.info(f"Scan {scan_id} triggered via API: config_id={config_id}")

        return jsonify({
            'scan_id': scan_id,
@@ -164,10 +262,10 @@ def trigger_scan():
        }), 201

    except ValueError as e:
        # Config file validation error
        # Config validation error
        error_message = str(e)
        logger.warning(f"Invalid config file: {error_message}")
        logger.warning(f"Request data: config_file='{config_file}'")
        logger.warning(f"Invalid config: {error_message}")
        logger.warning(f"Request data: config_id='{config_id}'")
        return jsonify({
            'error': 'Invalid request',
            'message': error_message
@@ -231,6 +329,77 @@ def delete_scan(scan_id):
        }), 500


@bp.route('/<int:scan_id>/stop', methods=['POST'])
@api_auth_required
def stop_running_scan(scan_id):
    """
    Stop a running scan with smart recovery for orphaned scans.

    If the scan is actively running in the registry, sends a cancel signal.
    If the scan shows as running/finalizing but is not in the registry (orphaned),
    performs smart recovery: marks as 'completed' if output files exist,
    otherwise marks as 'cancelled'.

    Args:
        scan_id: Scan ID to stop

    Returns:
        JSON response with stop status or recovery result
    """
    try:
        session = current_app.db_session

        # Check if scan exists
        scan = session.query(Scan).filter_by(id=scan_id).first()
        if not scan:
            logger.warning(f"Scan not found for stop request: {scan_id}")
            return jsonify({
                'error': 'Not found',
                'message': f'Scan with ID {scan_id} not found'
            }), 404

        # Allow stopping scans with status 'running' or 'finalizing'
        if scan.status not in ('running', 'finalizing'):
            logger.warning(f"Cannot stop scan {scan_id}: status is '{scan.status}'")
            return jsonify({
                'error': 'Invalid state',
                'message': f"Cannot stop scan: status is '{scan.status}'"
            }), 400

        # Get database URL from app config
        db_url = current_app.config['SQLALCHEMY_DATABASE_URI']

        # Attempt to stop the scan
        stopped = stop_scan(scan_id, db_url)

        if stopped:
            logger.info(f"Stop signal sent to scan {scan_id}")
            return jsonify({
                'scan_id': scan_id,
                'message': 'Stop signal sent to scan',
                'status': 'stopping'
            }), 200
        else:
            # Scanner not in registry - this is an orphaned scan
            # Attempt smart recovery
            logger.warning(f"Scan {scan_id} not in registry, attempting smart recovery")
            recovery_result = _recover_orphaned_scan(scan, session)
            return jsonify(recovery_result), 200

    except SQLAlchemyError as e:
        logger.error(f"Database error stopping scan {scan_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to stop scan'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error stopping scan {scan_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500
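
A client calling the stop endpoint should be prepared for both outcomes, the cancel signal and the orphan-recovery path; a sketch (scan ID and URL are illustrative):

```python
import requests

resp = requests.post('http://localhost:5000/api/scans/17/stop')
body = resp.json()
if body.get('status') == 'stopping':
    print('cancel signal delivered')
else:
    # Orphaned scan: the server decided 'completed' or 'cancelled' from output files
    print(body['recovery_type'], '->', body['status'])
```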


@bp.route('/<int:scan_id>/status', methods=['GET'])
@api_auth_required
def get_scan_status(scan_id):
@@ -272,6 +441,141 @@ def get_scan_status(scan_id):
        }), 500


@bp.route('/<int:scan_id>/progress', methods=['GET'])
@api_auth_required
def get_scan_progress(scan_id):
    """
    Get detailed progress for a running scan including per-IP results.

    Args:
        scan_id: Scan ID

    Returns:
        JSON response with scan progress including:
        - current_phase: Current scan phase
        - total_ips: Total IPs being scanned
        - completed_ips: Number of IPs completed in current phase
        - progress_entries: List of per-IP progress with discovered results
    """
    try:
        session = current_app.db_session

        # Get scan record
        scan = session.query(Scan).filter_by(id=scan_id).first()
        if not scan:
            logger.warning(f"Scan not found for progress check: {scan_id}")
            return jsonify({
                'error': 'Not found',
                'message': f'Scan with ID {scan_id} not found'
            }), 404

        # Get progress entries
        progress_entries = session.query(ScanProgress).filter_by(scan_id=scan_id).all()

        # Build progress data
        entries = []
        for entry in progress_entries:
            entry_data = {
                'ip_address': entry.ip_address,
                'site_name': entry.site_name,
                'phase': entry.phase,
                'status': entry.status,
                'ping_result': entry.ping_result
            }

            # Parse JSON fields
            if entry.tcp_ports:
                entry_data['tcp_ports'] = json.loads(entry.tcp_ports)
            else:
                entry_data['tcp_ports'] = []

            if entry.udp_ports:
                entry_data['udp_ports'] = json.loads(entry.udp_ports)
            else:
                entry_data['udp_ports'] = []

            if entry.services:
                entry_data['services'] = json.loads(entry.services)
            else:
                entry_data['services'] = []

            entries.append(entry_data)

        # Sort entries by site name then IP (numerically)
        def ip_sort_key(ip_str):
            """Convert IP to tuple of integers for proper numeric sorting."""
            try:
                return tuple(int(octet) for octet in ip_str.split('.'))
            except (ValueError, AttributeError):
                return (0, 0, 0, 0)

        entries.sort(key=lambda x: (x['site_name'] or '', ip_sort_key(x['ip_address'])))

        response = {
            'scan_id': scan_id,
            'status': scan.status,
            'current_phase': scan.current_phase or 'pending',
            'total_ips': scan.total_ips or 0,
            'completed_ips': scan.completed_ips or 0,
            'progress_entries': entries
        }

        logger.debug(f"Retrieved progress for scan {scan_id}: phase={scan.current_phase}, {scan.completed_ips}/{scan.total_ips} IPs")
        return jsonify(response)

    except SQLAlchemyError as e:
        logger.error(f"Database error retrieving scan progress {scan_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to retrieve scan progress'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error retrieving scan progress {scan_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500
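
The per-IP progress feed lends itself to simple polling; a sketch (interval and base URL are illustrative):

```python
import time
import requests

def wait_for_scan(scan_id: int, base: str = 'http://localhost:5000/api') -> str:
    """Poll the progress endpoint until the scan leaves its running states."""
    while True:
        p = requests.get(f'{base}/scans/{scan_id}/progress').json()
        print(f"{p['current_phase']}: {p['completed_ips']}/{p['total_ips']} IPs")
        if p['status'] not in ('running', 'finalizing'):
            return p['status']
        time.sleep(5)
```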
|
||||
|
||||
|
||||
@bp.route('/by-ip/<ip_address>', methods=['GET'])
|
||||
@api_auth_required
|
||||
def get_scans_by_ip(ip_address):
|
||||
"""
|
||||
Get last 10 scans containing a specific IP address.
|
||||
|
||||
Args:
|
||||
ip_address: IP address to search for
|
||||
|
||||
Returns:
|
||||
JSON response with list of scans containing the IP
|
||||
"""
|
||||
try:
|
||||
# Get scans from service
|
||||
scan_service = ScanService(current_app.db_session)
|
||||
scans = scan_service.get_scans_by_ip(ip_address)
|
||||
|
||||
logger.info(f"Retrieved {len(scans)} scans for IP: {ip_address}")
|
||||
|
||||
return jsonify({
|
||||
'ip_address': ip_address,
|
||||
'scans': scans,
|
||||
'count': len(scans)
|
||||
})
|
||||
|
||||
except SQLAlchemyError as e:
|
||||
logger.error(f"Database error retrieving scans for IP {ip_address}: {str(e)}")
|
||||
return jsonify({
|
||||
'error': 'Database error',
|
||||
'message': 'Failed to retrieve scans'
|
||||
}), 500
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error retrieving scans for IP {ip_address}: {str(e)}", exc_info=True)
|
||||
return jsonify({
|
||||
'error': 'Internal server error',
|
||||
'message': 'An unexpected error occurred'
|
||||
}), 500
|
||||
|
||||
|
||||
@bp.route('/<int:scan_id1>/compare/<int:scan_id2>', methods=['GET'])
|
||||
@api_auth_required
|
||||
def compare_scans(scan_id1, scan_id2):
|
||||
|
||||
@@ -88,7 +88,7 @@ def create_schedule():
|
||||
|
||||
Request body:
|
||||
name: Schedule name (required)
|
||||
config_file: Path to YAML config (required)
|
||||
config_id: Database config ID (required)
|
||||
cron_expression: Cron expression (required, e.g., '0 2 * * *')
|
||||
enabled: Whether schedule is active (optional, default: true)
|
||||
|
||||
@@ -99,7 +99,7 @@ def create_schedule():
|
||||
data = request.get_json() or {}
|
||||
|
||||
# Validate required fields
|
||||
required = ['name', 'config_file', 'cron_expression']
|
||||
required = ['name', 'config_id', 'cron_expression']
|
||||
missing = [field for field in required if field not in data]
|
||||
if missing:
|
||||
return jsonify({'error': f'Missing required fields: {", ".join(missing)}'}), 400
|
||||
@@ -108,7 +108,7 @@ def create_schedule():
|
||||
schedule_service = ScheduleService(current_app.db_session)
|
||||
schedule_id = schedule_service.create_schedule(
|
||||
name=data['name'],
|
||||
config_file=data['config_file'],
|
||||
config_id=data['config_id'],
|
||||
cron_expression=data['cron_expression'],
|
||||
enabled=data.get('enabled', True)
|
||||
)
|
||||
@@ -121,7 +121,7 @@ def create_schedule():
|
||||
try:
|
||||
current_app.scheduler.add_scheduled_scan(
|
||||
schedule_id=schedule_id,
|
||||
config_file=schedule['config_file'],
|
||||
config_id=schedule['config_id'],
|
||||
cron_expression=schedule['cron_expression']
|
||||
)
|
||||
logger.info(f"Schedule {schedule_id} added to APScheduler")
|
||||
@@ -154,7 +154,7 @@ def update_schedule(schedule_id):
|
||||
|
||||
Request body:
|
||||
name: Schedule name (optional)
|
||||
config_file: Path to YAML config (optional)
|
||||
config_id: Database config ID (optional)
|
||||
cron_expression: Cron expression (optional)
|
||||
enabled: Whether schedule is active (optional)
|
||||
|
||||
@@ -181,7 +181,7 @@ def update_schedule(schedule_id):
|
||||
try:
|
||||
# If cron expression or config changed, or enabled status changed
|
||||
cron_changed = 'cron_expression' in data
|
||||
config_changed = 'config_file' in data
|
||||
config_changed = 'config_id' in data
|
||||
enabled_changed = 'enabled' in data
|
||||
|
||||
if enabled_changed:
|
||||
@@ -189,7 +189,7 @@ def update_schedule(schedule_id):
|
||||
# Re-add to scheduler (replaces existing)
|
||||
current_app.scheduler.add_scheduled_scan(
|
||||
schedule_id=schedule_id,
|
||||
config_file=updated_schedule['config_file'],
|
||||
config_id=updated_schedule['config_id'],
|
||||
cron_expression=updated_schedule['cron_expression']
|
||||
)
|
||||
logger.info(f"Schedule {schedule_id} enabled and added to APScheduler")
|
||||
@@ -201,7 +201,7 @@ def update_schedule(schedule_id):
|
||||
# Reload schedule in APScheduler
|
||||
current_app.scheduler.add_scheduled_scan(
|
||||
schedule_id=schedule_id,
|
||||
config_file=updated_schedule['config_file'],
|
||||
config_id=updated_schedule['config_id'],
|
||||
cron_expression=updated_schedule['cron_expression']
|
||||
)
|
||||
logger.info(f"Schedule {schedule_id} reloaded in APScheduler")
|
||||
@@ -293,7 +293,7 @@ def trigger_schedule(schedule_id):
|
||||
scheduler = current_app.scheduler if hasattr(current_app, 'scheduler') else None
|
||||
|
||||
scan_id = scan_service.trigger_scan(
|
||||
config_file=schedule['config_file'],
|
||||
config_id=schedule['config_id'],
|
||||
triggered_by='manual',
|
||||
schedule_id=schedule_id,
|
||||
scheduler=scheduler
|
||||
|
||||
@@ -75,6 +75,12 @@ def update_settings():
             'status': 'success',
             'message': f'Updated {len(settings_dict)} settings'
         })
+    except ValueError as e:
+        # Handle read-only setting attempts
+        return jsonify({
+            'status': 'error',
+            'message': str(e)
+        }), 403
     except Exception as e:
         current_app.logger.error(f"Failed to update settings: {e}")
         return jsonify({
@@ -112,7 +118,8 @@ def get_setting(key):
         return jsonify({
             'status': 'success',
             'key': key,
-            'value': value
+            'value': value,
+            'read_only': settings_manager._is_read_only(key)
         })
     except Exception as e:
         current_app.logger.error(f"Failed to retrieve setting {key}: {e}")
@@ -154,6 +161,12 @@ def update_setting(key):
             'status': 'success',
             'message': f'Setting "{key}" updated'
         })
+    except ValueError as e:
+        # Handle read-only setting attempts
+        return jsonify({
+            'status': 'error',
+            'message': str(e)
+        }), 403
     except Exception as e:
         current_app.logger.error(f"Failed to update setting {key}: {e}")
         return jsonify({
@@ -176,6 +189,14 @@ def delete_setting(key):
     """
     try:
         settings_manager = get_settings_manager()

+        # Prevent deletion of read-only settings
+        if settings_manager._is_read_only(key):
+            return jsonify({
+                'status': 'error',
+                'message': f'Setting "{key}" is read-only and cannot be deleted'
+            }), 403
+
         deleted = settings_manager.delete(key)

         if not deleted:
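With these changes, writes to and deletions of read-only settings surface as 403s instead of silent failures. A quick probe sketch, assuming a read-only key such as a version field and the same placeholder host and auth as above:

```python
import requests

# Hypothetical probe of the read-only guard; "app_version" stands in for
# whatever keys your deployment marks read-only.
resp = requests.put(
    "http://localhost:5000/api/settings/app_version",
    json={"value": "2.0.0"},
    headers={"X-API-Key": "<your-key>"},
)
assert resp.status_code == 403
print(resp.json())  # {'status': 'error', 'message': '... read-only ...'}
```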
661 app/web/api/sites.py (Normal file)
@@ -0,0 +1,661 @@
"""
|
||||
Sites API blueprint.
|
||||
|
||||
Handles endpoints for managing reusable site definitions, including CIDR ranges
|
||||
and IP-level overrides.
|
||||
"""
|
||||
|
||||
import logging
|
||||
from flask import Blueprint, current_app, jsonify, request
|
||||
from sqlalchemy.exc import SQLAlchemyError
|
||||
|
||||
from web.auth.decorators import api_auth_required
|
||||
from web.services.site_service import SiteService
|
||||
from web.utils.pagination import validate_page_params
|
||||
|
||||
bp = Blueprint('sites', __name__)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@bp.route('', methods=['GET'])
|
||||
@api_auth_required
|
||||
def list_sites():
|
||||
"""
|
||||
List all sites with pagination.
|
||||
|
||||
Query params:
|
||||
page: Page number (default: 1)
|
||||
per_page: Items per page (default: 20, max: 100)
|
||||
all: If 'true', returns all sites without pagination (for dropdowns)
|
||||
|
||||
Returns:
|
||||
JSON response with sites list and pagination info
|
||||
"""
|
||||
try:
|
||||
# Check if requesting all sites (no pagination)
|
||||
if request.args.get('all', '').lower() == 'true':
|
||||
site_service = SiteService(current_app.db_session)
|
||||
sites = site_service.list_all_sites()
|
||||
ip_stats = site_service.get_global_ip_stats()
|
||||
|
||||
logger.info(f"Listed all sites (count={len(sites)})")
|
||||
return jsonify({
|
||||
'sites': sites,
|
||||
'total_ips': ip_stats['total_ips'],
|
||||
'unique_ips': ip_stats['unique_ips'],
|
||||
'duplicate_ips': ip_stats['duplicate_ips']
|
||||
})
|
||||
|
||||
# Get and validate query parameters
|
||||
page = request.args.get('page', 1, type=int)
|
||||
per_page = request.args.get('per_page', 20, type=int)
|
||||
|
||||
# Validate pagination params
|
||||
page, per_page = validate_page_params(page, per_page)
|
||||
|
||||
# Get sites from service
|
||||
site_service = SiteService(current_app.db_session)
|
||||
paginated_result = site_service.list_sites(page=page, per_page=per_page)
|
||||
|
||||
logger.info(f"Listed sites: page={page}, per_page={per_page}, total={paginated_result.total}")
|
||||
|
||||
return jsonify({
|
||||
'sites': paginated_result.items,
|
||||
'total': paginated_result.total,
|
||||
'page': paginated_result.page,
|
||||
'per_page': paginated_result.per_page,
|
||||
'total_pages': paginated_result.pages,
|
||||
'has_prev': paginated_result.has_prev,
|
||||
'has_next': paginated_result.has_next
|
||||
})
|
||||
|
||||
except ValueError as e:
|
||||
logger.warning(f"Invalid request parameters: {str(e)}")
|
||||
return jsonify({
|
||||
'error': 'Invalid request',
|
||||
'message': str(e)
|
||||
}), 400
|
||||
except SQLAlchemyError as e:
|
||||
logger.error(f"Database error listing sites: {str(e)}")
|
||||
return jsonify({
|
||||
'error': 'Database error',
|
||||
'message': 'Failed to retrieve sites'
|
||||
}), 500
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error listing sites: {str(e)}", exc_info=True)
|
||||
return jsonify({
|
||||
'error': 'Internal server error',
|
||||
'message': 'An unexpected error occurred'
|
||||
}), 500
|
||||
|
||||
|
||||
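For dropdowns, the `all=true` shortcut skips pagination and returns global IP statistics alongside the sites. A sketch of both call styles, again with placeholder host and auth:

```python
import requests

base = "http://localhost:5000/api/sites"
headers = {"X-API-Key": "<your-key>"}  # assumed auth scheme

# Paginated listing
page1 = requests.get(base, params={"page": 1, "per_page": 20}, headers=headers).json()

# Unpaginated listing with global IP stats, e.g. for a dropdown
all_sites = requests.get(base, params={"all": "true"}, headers=headers).json()
print(all_sites["total_ips"], all_sites["duplicate_ips"])
```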
@bp.route('/<int:site_id>', methods=['GET'])
@api_auth_required
def get_site(site_id):
    """
    Get details for a specific site.

    Args:
        site_id: Site ID

    Returns:
        JSON response with site details including CIDRs and IP overrides
    """
    try:
        site_service = SiteService(current_app.db_session)
        site = site_service.get_site(site_id)

        if not site:
            logger.warning(f"Site not found: {site_id}")
            return jsonify({
                'error': 'Not found',
                'message': f'Site with ID {site_id} not found'
            }), 404

        logger.info(f"Retrieved site details: {site_id}")
        return jsonify(site)

    except SQLAlchemyError as e:
        logger.error(f"Database error retrieving site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to retrieve site'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error retrieving site {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('', methods=['POST'])
@api_auth_required
def create_site():
    """
    Create a new site.

    Request body:
        name: Site name (required, must be unique)
        description: Site description (optional)
        cidrs: List of CIDR definitions (optional, but recommended)
            [
                {
                    "cidr": "10.0.0.0/24",
                    "expected_ping": true,
                    "expected_tcp_ports": [22, 80, 443],
                    "expected_udp_ports": [53]
                }
            ]

    Returns:
        JSON response with created site data
    """
    try:
        data = request.get_json() or {}

        # Validate required fields
        name = data.get('name')
        if not name:
            logger.warning("Site creation request missing name")
            return jsonify({
                'error': 'Invalid request',
                'message': 'name is required'
            }), 400

        description = data.get('description')

        # Create site (empty initially)
        site_service = SiteService(current_app.db_session)
        site = site_service.create_site(
            name=name,
            description=description
        )

        logger.info(f"Created site '{name}' (id={site['id']})")
        return jsonify(site), 201

    except ValueError as e:
        logger.warning(f"Invalid site creation request: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error creating site: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to create site'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error creating site: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:site_id>', methods=['PUT'])
@api_auth_required
def update_site(site_id):
    """
    Update site metadata (name and/or description).

    Args:
        site_id: Site ID

    Request body:
        name: New site name (optional, must be unique)
        description: New description (optional)

    Returns:
        JSON response with updated site data
    """
    try:
        data = request.get_json() or {}

        name = data.get('name')
        description = data.get('description')

        # Update site
        site_service = SiteService(current_app.db_session)
        site = site_service.update_site(
            site_id=site_id,
            name=name,
            description=description
        )

        logger.info(f"Updated site {site_id}")
        return jsonify(site)

    except ValueError as e:
        logger.warning(f"Invalid site update request: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error updating site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to update site'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error updating site {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:site_id>', methods=['DELETE'])
@api_auth_required
def delete_site(site_id):
    """
    Delete a site.

    Prevents deletion if site is used in any scan.

    Args:
        site_id: Site ID

    Returns:
        JSON response with success message
    """
    try:
        site_service = SiteService(current_app.db_session)
        site_service.delete_site(site_id)

        logger.info(f"Deleted site {site_id}")
        return jsonify({
            'message': f'Site {site_id} deleted successfully'
        })

    except ValueError as e:
        logger.warning(f"Cannot delete site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error deleting site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to delete site'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error deleting site {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:site_id>/ips/bulk', methods=['POST'])
@api_auth_required
def bulk_add_ips(site_id):
    """
    Bulk add IPs to a site from CIDR or list.

    Args:
        site_id: Site ID

    Request body:
        source_type: "cidr" or "list" (required)
        cidr: CIDR notation if source_type="cidr" (e.g., "10.0.0.0/24")
        ips: List of IP addresses if source_type="list" (e.g., ["10.0.0.1", "10.0.0.2"])
        expected_ping: Expected ping response for all IPs (optional)
        expected_tcp_ports: List of expected TCP ports for all IPs (optional)
        expected_udp_ports: List of expected UDP ports for all IPs (optional)

    Returns:
        JSON response with count of IPs added and any errors
    """
    try:
        data = request.get_json() or {}

        source_type = data.get('source_type')
        if source_type not in ['cidr', 'list']:
            return jsonify({
                'error': 'Invalid request',
                'message': 'source_type must be "cidr" or "list"'
            }), 400

        expected_ping = data.get('expected_ping')
        expected_tcp_ports = data.get('expected_tcp_ports', [])
        expected_udp_ports = data.get('expected_udp_ports', [])

        site_service = SiteService(current_app.db_session)

        if source_type == 'cidr':
            cidr = data.get('cidr')
            if not cidr:
                return jsonify({
                    'error': 'Invalid request',
                    'message': 'cidr is required when source_type="cidr"'
                }), 400

            result = site_service.bulk_add_ips_from_cidr(
                site_id=site_id,
                cidr=cidr,
                expected_ping=expected_ping,
                expected_tcp_ports=expected_tcp_ports,
                expected_udp_ports=expected_udp_ports
            )

            logger.info(f"Bulk added {result['ip_count']} IPs from CIDR '{cidr}' to site {site_id}")
            return jsonify(result), 201

        else:  # source_type == 'list'
            ip_list = data.get('ips', [])
            if not isinstance(ip_list, list):
                return jsonify({
                    'error': 'Invalid request',
                    'message': 'ips must be a list when source_type="list"'
                }), 400

            result = site_service.bulk_add_ips_from_list(
                site_id=site_id,
                ip_list=ip_list,
                expected_ping=expected_ping,
                expected_tcp_ports=expected_tcp_ports,
                expected_udp_ports=expected_udp_ports
            )

            logger.info(f"Bulk added {result['ip_count']} IPs from list to site {site_id}")
            return jsonify(result), 201

    except ValueError as e:
        logger.warning(f"Invalid bulk IP request: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error bulk adding IPs to site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to add IPs'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error bulk adding IPs to site {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500
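Bulk population is the quickest way to seed a site. A sketch of both `source_type` variants against this endpoint; the site ID and port expectations are illustrative:

```python
import requests

url = "http://localhost:5000/api/sites/1/ips/bulk"  # site_id=1 assumed
headers = {"X-API-Key": "<your-key>"}

# Expand a whole CIDR with shared expectations
requests.post(url, headers=headers, json={
    "source_type": "cidr",
    "cidr": "10.0.0.0/28",
    "expected_ping": True,
    "expected_tcp_ports": [22, 443],
})

# Or add an explicit list of addresses
requests.post(url, headers=headers, json={
    "source_type": "list",
    "ips": ["10.0.1.5", "10.0.1.6"],
    "expected_udp_ports": [53],
})
```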
@bp.route('/<int:site_id>/ips', methods=['GET'])
@api_auth_required
def list_ips(site_id):
    """
    List IPs in a site with pagination.

    Query params:
        page: Page number (default: 1)
        per_page: Items per page (default: 50, max: 200)

    Returns:
        JSON response with IPs list and pagination info
    """
    try:
        # Get and validate query parameters
        page = request.args.get('page', 1, type=int)
        per_page = request.args.get('per_page', 50, type=int)

        # Validate pagination params
        page, per_page = validate_page_params(page, per_page, max_per_page=200)

        # Get IPs from service
        site_service = SiteService(current_app.db_session)
        paginated_result = site_service.list_ips(
            site_id=site_id,
            page=page,
            per_page=per_page
        )

        logger.info(f"Listed IPs for site {site_id}: page={page}, per_page={per_page}, total={paginated_result.total}")

        return jsonify({
            'ips': paginated_result.items,
            'total': paginated_result.total,
            'page': paginated_result.page,
            'per_page': paginated_result.per_page,
            'total_pages': paginated_result.pages,
            'has_prev': paginated_result.has_prev,
            'has_next': paginated_result.has_next
        })

    except ValueError as e:
        logger.warning(f"Invalid request parameters: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error listing IPs for site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to retrieve IPs'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error listing IPs for site {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:site_id>/ips', methods=['POST'])
@api_auth_required
def add_standalone_ip(site_id):
    """
    Add a standalone IP (without CIDR parent) to a site.

    Args:
        site_id: Site ID

    Request body:
        ip_address: IP address (required)
        expected_ping: Expected ping response (optional)
        expected_tcp_ports: List of expected TCP ports (optional)
        expected_udp_ports: List of expected UDP ports (optional)

    Returns:
        JSON response with created IP data
    """
    try:
        data = request.get_json() or {}

        # Validate required fields
        ip_address = data.get('ip_address')
        if not ip_address:
            logger.warning("Standalone IP creation request missing ip_address")
            return jsonify({
                'error': 'Invalid request',
                'message': 'ip_address is required'
            }), 400

        expected_ping = data.get('expected_ping')
        expected_tcp_ports = data.get('expected_tcp_ports', [])
        expected_udp_ports = data.get('expected_udp_ports', [])

        # Add standalone IP
        site_service = SiteService(current_app.db_session)
        ip_data = site_service.add_standalone_ip(
            site_id=site_id,
            ip_address=ip_address,
            expected_ping=expected_ping,
            expected_tcp_ports=expected_tcp_ports,
            expected_udp_ports=expected_udp_ports
        )

        logger.info(f"Added standalone IP '{ip_address}' to site {site_id}")
        return jsonify(ip_data), 201

    except ValueError as e:
        logger.warning(f"Invalid standalone IP creation request: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error adding standalone IP to site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to add IP'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error adding standalone IP to site {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:site_id>/ips/<int:ip_id>', methods=['PUT'])
@api_auth_required
def update_ip_settings(site_id, ip_id):
    """
    Update settings for an individual IP.

    Args:
        site_id: Site ID
        ip_id: IP ID

    Request body:
        expected_ping: New ping expectation (optional)
        expected_tcp_ports: New TCP ports expectation (optional)
        expected_udp_ports: New UDP ports expectation (optional)

    Returns:
        JSON response with updated IP data
    """
    try:
        data = request.get_json() or {}

        expected_ping = data.get('expected_ping')
        expected_tcp_ports = data.get('expected_tcp_ports')
        expected_udp_ports = data.get('expected_udp_ports')

        # Update IP settings
        site_service = SiteService(current_app.db_session)
        ip_data = site_service.update_ip_settings(
            site_id=site_id,
            ip_id=ip_id,
            expected_ping=expected_ping,
            expected_tcp_ports=expected_tcp_ports,
            expected_udp_ports=expected_udp_ports
        )

        logger.info(f"Updated IP {ip_id} in site {site_id}")
        return jsonify(ip_data)

    except ValueError as e:
        logger.warning(f"Invalid IP update request: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error updating IP {ip_id} in site {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to update IP'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error updating IP {ip_id} in site {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:site_id>/ips/<int:ip_id>', methods=['DELETE'])
@api_auth_required
def remove_ip(site_id, ip_id):
    """
    Remove an IP from a site.

    Args:
        site_id: Site ID
        ip_id: IP ID

    Returns:
        JSON response with success message
    """
    try:
        site_service = SiteService(current_app.db_session)
        site_service.remove_ip(site_id, ip_id)

        logger.info(f"Removed IP {ip_id} from site {site_id}")
        return jsonify({
            'message': f'IP {ip_id} removed successfully'
        })

    except ValueError as e:
        logger.warning(f"Cannot remove IP {ip_id}: {str(e)}")
        return jsonify({
            'error': 'Invalid request',
            'message': str(e)
        }), 400
    except SQLAlchemyError as e:
        logger.error(f"Database error removing IP {ip_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to remove IP'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error removing IP {ip_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500


@bp.route('/<int:site_id>/usage', methods=['GET'])
@api_auth_required
def get_site_usage(site_id):
    """
    Get list of scans that use this site.

    Args:
        site_id: Site ID

    Returns:
        JSON response with list of scans
    """
    try:
        site_service = SiteService(current_app.db_session)

        # First check if site exists
        site = site_service.get_site(site_id)
        if not site:
            logger.warning(f"Site not found: {site_id}")
            return jsonify({
                'error': 'Not found',
                'message': f'Site with ID {site_id} not found'
            }), 404

        scans = site_service.get_scan_usage(site_id)

        logger.info(f"Retrieved usage for site {site_id} (count={len(scans)})")
        return jsonify({
            'site_id': site_id,
            'site_name': site['name'],
            'scans': scans,
            'count': len(scans)
        })

    except SQLAlchemyError as e:
        logger.error(f"Database error retrieving site usage {site_id}: {str(e)}")
        return jsonify({
            'error': 'Database error',
            'message': 'Failed to retrieve site usage'
        }), 500
    except Exception as e:
        logger.error(f"Unexpected error retrieving site usage {site_id}: {str(e)}", exc_info=True)
        return jsonify({
            'error': 'Internal server error',
            'message': 'An unexpected error occurred'
        }), 500
@@ -198,12 +198,12 @@ def scan_history(scan_id):
         if not reference_scan:
             return jsonify({'error': 'Scan not found'}), 404

-        config_file = reference_scan.config_file
+        config_id = reference_scan.config_id

-        # Query historical scans with the same config file
+        # Query historical scans with the same config_id
         historical_scans = (
             db_session.query(Scan)
-            .filter(Scan.config_file == config_file)
+            .filter(Scan.config_id == config_id)
             .filter(Scan.status == 'completed')
             .order_by(Scan.timestamp.desc())
             .limit(limit)
@@ -247,7 +247,7 @@ def scan_history(scan_id):
             'scans': scans_data,
             'labels': labels,
             'port_counts': port_counts,
-            'config_file': config_file
+            'config_id': config_id
         }), 200

     except SQLAlchemyError as e:
677 app/web/api/webhooks.py (Normal file)
@@ -0,0 +1,677 @@
"""
|
||||
Webhooks API blueprint.
|
||||
|
||||
Handles endpoints for managing webhook configurations and viewing delivery logs.
|
||||
"""
|
||||
|
||||
import json
|
||||
from datetime import datetime, timezone
|
||||
from flask import Blueprint, jsonify, request, current_app
|
||||
|
||||
from web.auth.decorators import api_auth_required
|
||||
from web.models import Webhook, WebhookDeliveryLog, Alert
|
||||
from web.services.webhook_service import WebhookService
|
||||
from web.services.template_service import get_template_service
|
||||
|
||||
bp = Blueprint('webhooks_api', __name__)
|
||||
|
||||
|
||||
@bp.route('', methods=['GET'])
|
||||
@api_auth_required
|
||||
def list_webhooks():
|
||||
"""
|
||||
List all webhooks with optional filtering.
|
||||
|
||||
Query params:
|
||||
page: Page number (default: 1)
|
||||
per_page: Items per page (default: 20)
|
||||
enabled: Filter by enabled status (true/false)
|
||||
|
||||
Returns:
|
||||
JSON response with webhooks list
|
||||
"""
|
||||
# Get query parameters
|
||||
page = request.args.get('page', 1, type=int)
|
||||
per_page = min(request.args.get('per_page', 20, type=int), 100) # Max 100 items
|
||||
enabled = request.args.get('enabled')
|
||||
|
||||
# Build query
|
||||
query = current_app.db_session.query(Webhook)
|
||||
|
||||
# Apply enabled filter
|
||||
if enabled is not None:
|
||||
enabled_bool = enabled.lower() == 'true'
|
||||
query = query.filter(Webhook.enabled == enabled_bool)
|
||||
|
||||
# Order by name
|
||||
query = query.order_by(Webhook.name)
|
||||
|
||||
# Paginate
|
||||
total = query.count()
|
||||
webhooks = query.offset((page - 1) * per_page).limit(per_page).all()
|
||||
|
||||
# Format response
|
||||
webhooks_data = []
|
||||
|
||||
for webhook in webhooks:
|
||||
# Parse JSON fields
|
||||
alert_types = json.loads(webhook.alert_types) if webhook.alert_types else None
|
||||
severity_filter = json.loads(webhook.severity_filter) if webhook.severity_filter else None
|
||||
custom_headers = json.loads(webhook.custom_headers) if webhook.custom_headers else None
|
||||
|
||||
webhooks_data.append({
|
||||
'id': webhook.id,
|
||||
'name': webhook.name,
|
||||
'url': webhook.url,
|
||||
'enabled': webhook.enabled,
|
||||
'auth_type': webhook.auth_type,
|
||||
'auth_token': '***ENCRYPTED***' if webhook.auth_token else None, # Mask sensitive data
|
||||
'custom_headers': custom_headers,
|
||||
'alert_types': alert_types,
|
||||
'severity_filter': severity_filter,
|
||||
'timeout': webhook.timeout,
|
||||
'retry_count': webhook.retry_count,
|
||||
'created_at': webhook.created_at.isoformat() if webhook.created_at else None,
|
||||
'updated_at': webhook.updated_at.isoformat() if webhook.updated_at else None
|
||||
})
|
||||
|
||||
return jsonify({
|
||||
'webhooks': webhooks_data,
|
||||
'total': total,
|
||||
'page': page,
|
||||
'per_page': per_page,
|
||||
'pages': (total + per_page - 1) // per_page
|
||||
})
|
||||
|
||||
|
||||
@bp.route('/<int:webhook_id>', methods=['GET'])
|
||||
@api_auth_required
|
||||
def get_webhook(webhook_id):
|
||||
"""
|
||||
Get a specific webhook by ID.
|
||||
|
||||
Args:
|
||||
webhook_id: Webhook ID
|
||||
|
||||
Returns:
|
||||
JSON response with webhook details
|
||||
"""
|
||||
webhook = current_app.db_session.query(Webhook).filter(Webhook.id == webhook_id).first()
|
||||
|
||||
if not webhook:
|
||||
return jsonify({
|
||||
'status': 'error',
|
||||
'message': f'Webhook {webhook_id} not found'
|
||||
}), 404
|
||||
|
||||
# Parse JSON fields
|
||||
alert_types = json.loads(webhook.alert_types) if webhook.alert_types else None
|
||||
severity_filter = json.loads(webhook.severity_filter) if webhook.severity_filter else None
|
||||
custom_headers = json.loads(webhook.custom_headers) if webhook.custom_headers else None
|
||||
|
||||
return jsonify({
|
||||
'webhook': {
|
||||
'id': webhook.id,
|
||||
'name': webhook.name,
|
||||
'url': webhook.url,
|
||||
'enabled': webhook.enabled,
|
||||
'auth_type': webhook.auth_type,
|
||||
'auth_token': '***ENCRYPTED***' if webhook.auth_token else None,
|
||||
'custom_headers': custom_headers,
|
||||
'alert_types': alert_types,
|
||||
'severity_filter': severity_filter,
|
||||
'timeout': webhook.timeout,
|
||||
'retry_count': webhook.retry_count,
|
||||
'created_at': webhook.created_at.isoformat() if webhook.created_at else None,
|
||||
'updated_at': webhook.updated_at.isoformat() if webhook.updated_at else None
|
||||
}
|
||||
})
|
||||
|
||||
|
||||
@bp.route('', methods=['POST'])
|
||||
@api_auth_required
|
||||
def create_webhook():
|
||||
"""
|
||||
Create a new webhook.
|
||||
|
||||
Request body:
|
||||
name: Webhook name (required)
|
||||
url: Webhook URL (required)
|
||||
enabled: Whether webhook is enabled (default: true)
|
||||
auth_type: Authentication type (none, bearer, basic, custom)
|
||||
auth_token: Authentication token (encrypted on storage)
|
||||
custom_headers: JSON object with custom headers
|
||||
alert_types: Array of alert types to filter
|
||||
severity_filter: Array of severities to filter
|
||||
timeout: Request timeout in seconds (default: 10)
|
||||
retry_count: Number of retry attempts (default: 3)
|
||||
template: Jinja2 template for custom payload (optional)
|
||||
template_format: Template format - 'json' or 'text' (default: json)
|
||||
content_type_override: Custom Content-Type header (optional)
|
||||
|
||||
Returns:
|
||||
JSON response with created webhook
|
||||
"""
|
||||
data = request.get_json() or {}
|
||||
|
||||
# Validate required fields
|
||||
if not data.get('name'):
|
||||
return jsonify({
|
||||
'status': 'error',
|
||||
'message': 'name is required'
|
||||
}), 400
|
||||
|
||||
if not data.get('url'):
|
||||
return jsonify({
|
||||
'status': 'error',
|
||||
'message': 'url is required'
|
||||
}), 400
|
||||
|
||||
# Validate auth_type
|
||||
valid_auth_types = ['none', 'bearer', 'basic', 'custom']
|
||||
auth_type = data.get('auth_type', 'none')
|
||||
if auth_type not in valid_auth_types:
|
||||
return jsonify({
|
||||
'status': 'error',
|
||||
'message': f'Invalid auth_type. Must be one of: {", ".join(valid_auth_types)}'
|
||||
}), 400
|
||||
|
||||
# Validate template_format
|
||||
valid_template_formats = ['json', 'text']
|
||||
template_format = data.get('template_format', 'json')
|
||||
if template_format not in valid_template_formats:
|
||||
return jsonify({
|
||||
'status': 'error',
|
||||
'message': f'Invalid template_format. Must be one of: {", ".join(valid_template_formats)}'
|
||||
}), 400
|
||||
|
||||
# Validate template if provided
|
||||
template = data.get('template')
|
||||
if template:
|
||||
template_service = get_template_service()
|
||||
is_valid, error_msg = template_service.validate_template(template, template_format)
|
||||
if not is_valid:
|
||||
return jsonify({
|
||||
'status': 'error',
|
||||
'message': f'Invalid template: {error_msg}'
|
||||
}), 400
|
||||
|
||||
try:
|
||||
webhook_service = WebhookService(current_app.db_session)
|
||||
|
||||
# Encrypt auth_token if provided
|
||||
auth_token = None
|
||||
if data.get('auth_token'):
|
||||
auth_token = webhook_service._encrypt_value(data['auth_token'])
|
||||
|
||||
# Serialize JSON fields
|
||||
alert_types = json.dumps(data['alert_types']) if data.get('alert_types') else None
|
||||
severity_filter = json.dumps(data['severity_filter']) if data.get('severity_filter') else None
|
||||
custom_headers = json.dumps(data['custom_headers']) if data.get('custom_headers') else None
|
||||
|
||||
# Create webhook
|
||||
webhook = Webhook(
|
||||
name=data['name'],
|
||||
url=data['url'],
|
||||
enabled=data.get('enabled', True),
|
||||
auth_type=auth_type,
|
||||
auth_token=auth_token,
|
||||
custom_headers=custom_headers,
|
||||
alert_types=alert_types,
|
||||
severity_filter=severity_filter,
|
||||
timeout=data.get('timeout', 10),
|
||||
retry_count=data.get('retry_count', 3),
|
||||
template=template,
|
||||
template_format=template_format,
|
||||
content_type_override=data.get('content_type_override'),
|
||||
created_at=datetime.now(timezone.utc),
|
||||
updated_at=datetime.now(timezone.utc)
|
||||
)
|
||||
|
||||
current_app.db_session.add(webhook)
|
||||
current_app.db_session.commit()
|
||||
|
||||
# Parse for response
|
||||
alert_types_parsed = json.loads(alert_types) if alert_types else None
|
||||
severity_filter_parsed = json.loads(severity_filter) if severity_filter else None
|
||||
custom_headers_parsed = json.loads(custom_headers) if custom_headers else None
|
||||
|
||||
return jsonify({
|
||||
'status': 'success',
|
||||
'message': 'Webhook created successfully',
|
||||
'webhook': {
|
||||
'id': webhook.id,
|
||||
'name': webhook.name,
|
||||
'url': webhook.url,
|
||||
'enabled': webhook.enabled,
|
||||
'auth_type': webhook.auth_type,
|
||||
'alert_types': alert_types_parsed,
|
||||
'severity_filter': severity_filter_parsed,
|
||||
'custom_headers': custom_headers_parsed,
|
||||
'timeout': webhook.timeout,
|
||||
'retry_count': webhook.retry_count,
|
||||
'template': webhook.template,
|
||||
'template_format': webhook.template_format,
|
||||
'content_type_override': webhook.content_type_override,
|
||||
'created_at': webhook.created_at.isoformat()
|
||||
}
|
||||
}), 201
|
||||
|
||||
except Exception as e:
|
||||
current_app.db_session.rollback()
|
||||
return jsonify({
|
||||
'status': 'error',
|
||||
'message': f'Failed to create webhook: {str(e)}'
|
||||
}), 500
|
||||
|
||||
|
||||
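A sketch of registering a webhook with bearer auth and a severity filter; the receiver URL, token, and severity names are placeholders rather than values confirmed by this diff:

```python
import requests

requests.post(
    "http://localhost:5000/api/webhooks",
    headers={"X-API-Key": "<your-key>"},
    json={
        "name": "ops-alerts",
        "url": "https://hooks.example.com/T000/B000",  # placeholder receiver
        "auth_type": "bearer",
        "auth_token": "<receiver-token>",          # encrypted at rest by the service
        "severity_filter": ["high", "critical"],   # assumed severity names
        "timeout": 10,
        "retry_count": 3,
    },
)
```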
@bp.route('/<int:webhook_id>', methods=['PUT'])
@api_auth_required
def update_webhook(webhook_id):
    """
    Update an existing webhook.

    Args:
        webhook_id: Webhook ID

    Request body (all optional):
        name: Webhook name
        url: Webhook URL
        enabled: Whether webhook is enabled
        auth_type: Authentication type
        auth_token: Authentication token
        custom_headers: JSON object with custom headers
        alert_types: Array of alert types
        severity_filter: Array of severities
        timeout: Request timeout
        retry_count: Retry attempts
        template: Jinja2 template for custom payload
        template_format: Template format - 'json' or 'text'
        content_type_override: Custom Content-Type header

    Returns:
        JSON response with update status
    """
    webhook = current_app.db_session.query(Webhook).filter(Webhook.id == webhook_id).first()

    if not webhook:
        return jsonify({
            'status': 'error',
            'message': f'Webhook {webhook_id} not found'
        }), 404

    data = request.get_json() or {}

    # Validate auth_type if provided
    if 'auth_type' in data:
        valid_auth_types = ['none', 'bearer', 'basic', 'custom']
        if data['auth_type'] not in valid_auth_types:
            return jsonify({
                'status': 'error',
                'message': f'Invalid auth_type. Must be one of: {", ".join(valid_auth_types)}'
            }), 400

    # Validate template_format if provided
    if 'template_format' in data:
        valid_template_formats = ['json', 'text']
        if data['template_format'] not in valid_template_formats:
            return jsonify({
                'status': 'error',
                'message': f'Invalid template_format. Must be one of: {", ".join(valid_template_formats)}'
            }), 400

    # Validate template if provided
    if 'template' in data and data['template']:
        template_format = data.get('template_format', webhook.template_format or 'json')
        template_service = get_template_service()
        is_valid, error_msg = template_service.validate_template(data['template'], template_format)
        if not is_valid:
            return jsonify({
                'status': 'error',
                'message': f'Invalid template: {error_msg}'
            }), 400

    try:
        webhook_service = WebhookService(current_app.db_session)

        # Update fields if provided
        if 'name' in data:
            webhook.name = data['name']
        if 'url' in data:
            webhook.url = data['url']
        if 'enabled' in data:
            webhook.enabled = data['enabled']
        if 'auth_type' in data:
            webhook.auth_type = data['auth_type']
        if 'auth_token' in data:
            # Encrypt new token
            webhook.auth_token = webhook_service._encrypt_value(data['auth_token'])
        if 'custom_headers' in data:
            webhook.custom_headers = json.dumps(data['custom_headers']) if data['custom_headers'] else None
        if 'alert_types' in data:
            webhook.alert_types = json.dumps(data['alert_types']) if data['alert_types'] else None
        if 'severity_filter' in data:
            webhook.severity_filter = json.dumps(data['severity_filter']) if data['severity_filter'] else None
        if 'timeout' in data:
            webhook.timeout = data['timeout']
        if 'retry_count' in data:
            webhook.retry_count = data['retry_count']
        if 'template' in data:
            webhook.template = data['template']
        if 'template_format' in data:
            webhook.template_format = data['template_format']
        if 'content_type_override' in data:
            webhook.content_type_override = data['content_type_override']

        webhook.updated_at = datetime.now(timezone.utc)
        current_app.db_session.commit()

        # Parse for response
        alert_types = json.loads(webhook.alert_types) if webhook.alert_types else None
        severity_filter = json.loads(webhook.severity_filter) if webhook.severity_filter else None
        custom_headers = json.loads(webhook.custom_headers) if webhook.custom_headers else None

        return jsonify({
            'status': 'success',
            'message': 'Webhook updated successfully',
            'webhook': {
                'id': webhook.id,
                'name': webhook.name,
                'url': webhook.url,
                'enabled': webhook.enabled,
                'auth_type': webhook.auth_type,
                'alert_types': alert_types,
                'severity_filter': severity_filter,
                'custom_headers': custom_headers,
                'timeout': webhook.timeout,
                'retry_count': webhook.retry_count,
                'template': webhook.template,
                'template_format': webhook.template_format,
                'content_type_override': webhook.content_type_override,
                'updated_at': webhook.updated_at.isoformat()
            }
        })

    except Exception as e:
        current_app.db_session.rollback()
        return jsonify({
            'status': 'error',
            'message': f'Failed to update webhook: {str(e)}'
        }), 500


@bp.route('/<int:webhook_id>', methods=['DELETE'])
@api_auth_required
def delete_webhook(webhook_id):
    """
    Delete a webhook.

    Args:
        webhook_id: Webhook ID

    Returns:
        JSON response with deletion status
    """
    webhook = current_app.db_session.query(Webhook).filter(Webhook.id == webhook_id).first()

    if not webhook:
        return jsonify({
            'status': 'error',
            'message': f'Webhook {webhook_id} not found'
        }), 404

    try:
        # Delete webhook (delivery logs will be cascade deleted)
        current_app.db_session.delete(webhook)
        current_app.db_session.commit()

        return jsonify({
            'status': 'success',
            'message': f'Webhook {webhook_id} deleted successfully'
        })

    except Exception as e:
        current_app.db_session.rollback()
        return jsonify({
            'status': 'error',
            'message': f'Failed to delete webhook: {str(e)}'
        }), 500


@bp.route('/<int:webhook_id>/test', methods=['POST'])
@api_auth_required
def test_webhook(webhook_id):
    """
    Send a test payload to a webhook.

    Args:
        webhook_id: Webhook ID

    Returns:
        JSON response with test result
    """
    webhook = current_app.db_session.query(Webhook).filter(Webhook.id == webhook_id).first()

    if not webhook:
        return jsonify({
            'status': 'error',
            'message': f'Webhook {webhook_id} not found'
        }), 404

    # Test webhook delivery
    webhook_service = WebhookService(current_app.db_session)
    result = webhook_service.test_webhook(webhook_id)

    return jsonify({
        'status': 'success' if result['success'] else 'error',
        'message': result['message'],
        'status_code': result['status_code'],
        'response_body': result.get('response_body')
    })


@bp.route('/<int:webhook_id>/logs', methods=['GET'])
@api_auth_required
def get_webhook_logs(webhook_id):
    """
    Get delivery logs for a specific webhook.

    Args:
        webhook_id: Webhook ID

    Query params:
        page: Page number (default: 1)
        per_page: Items per page (default: 20)
        status: Filter by status (success/failed)

    Returns:
        JSON response with delivery logs
    """
    webhook = current_app.db_session.query(Webhook).filter(Webhook.id == webhook_id).first()

    if not webhook:
        return jsonify({
            'status': 'error',
            'message': f'Webhook {webhook_id} not found'
        }), 404

    # Get query parameters
    page = request.args.get('page', 1, type=int)
    per_page = min(request.args.get('per_page', 20, type=int), 100)
    status_filter = request.args.get('status')

    # Build query
    query = current_app.db_session.query(WebhookDeliveryLog).filter(
        WebhookDeliveryLog.webhook_id == webhook_id
    )

    # Apply status filter
    if status_filter:
        query = query.filter(WebhookDeliveryLog.status == status_filter)

    # Order by most recent first
    query = query.order_by(WebhookDeliveryLog.delivered_at.desc())

    # Paginate
    total = query.count()
    logs = query.offset((page - 1) * per_page).limit(per_page).all()

    # Format response
    logs_data = []
    for log in logs:
        # Get alert info
        alert = current_app.db_session.query(Alert).filter(Alert.id == log.alert_id).first()

        logs_data.append({
            'id': log.id,
            'alert_id': log.alert_id,
            'alert_type': alert.alert_type if alert else None,
            'alert_message': alert.message if alert else None,
            'status': log.status,
            'response_code': log.response_code,
            'response_body': log.response_body,
            'error_message': log.error_message,
            'attempt_number': log.attempt_number,
            'delivered_at': log.delivered_at.isoformat() if log.delivered_at else None
        })

    return jsonify({
        'webhook_id': webhook_id,
        'webhook_name': webhook.name,
        'logs': logs_data,
        'total': total,
        'page': page,
        'per_page': per_page,
        'pages': (total + per_page - 1) // per_page
    })


@bp.route('/preview-template', methods=['POST'])
@api_auth_required
def preview_template():
    """
    Preview a webhook template with sample data.

    Request body:
        template: Jinja2 template string (required)
        template_format: Template format - 'json' or 'text' (default: json)

    Returns:
        JSON response with rendered template preview
    """
    data = request.get_json() or {}

    if not data.get('template'):
        return jsonify({
            'status': 'error',
            'message': 'template is required'
        }), 400

    template = data['template']
    template_format = data.get('template_format', 'json')

    # Validate template format
    if template_format not in ['json', 'text']:
        return jsonify({
            'status': 'error',
            'message': 'Invalid template_format. Must be json or text'
        }), 400

    try:
        template_service = get_template_service()

        # Validate template
        is_valid, error_msg = template_service.validate_template(template, template_format)
        if not is_valid:
            return jsonify({
                'status': 'error',
                'message': f'Template validation error: {error_msg}'
            }), 400

        # Render with sample data
        rendered, error = template_service.render_test_payload(template, template_format)

        if error:
            return jsonify({
                'status': 'error',
                'message': f'Template rendering error: {error}'
            }), 400

        return jsonify({
            'status': 'success',
            'rendered': rendered,
            'format': template_format
        })

    except Exception as e:
        return jsonify({
            'status': 'error',
            'message': f'Failed to preview template: {str(e)}'
        }), 500
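Templates are validated before save, and `/preview-template` renders them against sample data without touching a real receiver. A sketch, assuming the template context exposes variables like `alert_type` and `message` (not confirmed by this diff):

```python
import requests

resp = requests.post(
    "http://localhost:5000/api/webhooks/preview-template",
    headers={"X-API-Key": "<your-key>"},
    json={
        "template": '{"text": "{{ alert_type }}: {{ message }}"}',  # assumed context vars
        "template_format": "json",
    },
)
print(resp.json().get("rendered"))
```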
@bp.route('/template-presets', methods=['GET'])
@api_auth_required
def get_template_presets():
    """
    Get list of available webhook template presets.

    Returns:
        JSON response with template presets
    """
    import os

    try:
        # Load presets manifest
        presets_file = os.path.join(
            os.path.dirname(__file__),
            '../templates/webhook_presets/presets.json'
        )

        with open(presets_file, 'r') as f:
            presets_manifest = json.load(f)

        # Load template contents for each preset
        presets_dir = os.path.join(
            os.path.dirname(__file__),
            '../templates/webhook_presets'
        )

        for preset in presets_manifest:
            template_file = os.path.join(presets_dir, preset['file'])
            with open(template_file, 'r') as f:
                preset['template'] = f.read()
            # Remove file reference from response
            del preset['file']

        return jsonify({
            'status': 'success',
            'presets': presets_manifest
        })

    except FileNotFoundError as e:
        return jsonify({
            'status': 'error',
            'message': f'Template presets not found: {str(e)}'
        }), 500
    except Exception as e:
        return jsonify({
            'status': 'error',
            'message': f'Failed to load template presets: {str(e)}'
        }), 500


# Health check endpoint
@bp.route('/health', methods=['GET'])
def health_check():
    """
    Health check endpoint for monitoring.

    Returns:
        JSON response with API health status
    """
    return jsonify({
        'status': 'healthy',
        'api': 'webhooks',
        'version': '1.0.0-phase5'
    })
@@ -95,6 +95,9 @@ def create_app(config: dict = None) -> Flask:
     # Register error handlers
     register_error_handlers(app)

+    # Register context processors
+    register_context_processors(app)
+
     # Add request/response handlers
     register_request_handlers(app)

@@ -304,9 +307,12 @@ def init_scheduler(app: Flask) -> None:
     with app.app_context():
         # Clean up any orphaned scans from previous crashes/restarts
         scan_service = ScanService(app.db_session)
-        orphaned_count = scan_service.cleanup_orphaned_scans()
-        if orphaned_count > 0:
-            app.logger.warning(f"Cleaned up {orphaned_count} orphaned scan(s) on startup")
+        cleanup_result = scan_service.cleanup_orphaned_scans()
+        if cleanup_result['total'] > 0:
+            app.logger.warning(
+                f"Cleaned up {cleanup_result['total']} orphaned scan(s) on startup: "
+                f"{cleanup_result['recovered']} recovered, {cleanup_result['failed']} failed"
+            )

         # Load all enabled schedules from database
         scheduler.load_schedules_on_startup()
@@ -328,11 +334,14 @@ def register_blueprints(app: Flask) -> None:
     from web.api.scans import bp as scans_bp
     from web.api.schedules import bp as schedules_bp
     from web.api.alerts import bp as alerts_bp
+    from web.api.webhooks import bp as webhooks_api_bp
     from web.api.settings import bp as settings_bp
     from web.api.stats import bp as stats_bp
     from web.api.configs import bp as configs_bp
+    from web.api.sites import bp as sites_bp
     from web.auth.routes import bp as auth_bp
     from web.routes.main import bp as main_bp
+    from web.routes.webhooks import bp as webhooks_bp

     # Register authentication blueprint
     app.register_blueprint(auth_bp, url_prefix='/auth')
@@ -340,13 +349,18 @@ def register_blueprints(app: Flask) -> None:
     # Register main web routes blueprint
     app.register_blueprint(main_bp, url_prefix='/')

+    # Register webhooks web routes blueprint
+    app.register_blueprint(webhooks_bp, url_prefix='/webhooks')
+
     # Register API blueprints
     app.register_blueprint(scans_bp, url_prefix='/api/scans')
     app.register_blueprint(schedules_bp, url_prefix='/api/schedules')
     app.register_blueprint(alerts_bp, url_prefix='/api/alerts')
+    app.register_blueprint(webhooks_api_bp, url_prefix='/api/webhooks')
     app.register_blueprint(settings_bp, url_prefix='/api/settings')
     app.register_blueprint(stats_bp, url_prefix='/api/stats')
     app.register_blueprint(configs_bp, url_prefix='/api/configs')
+    app.register_blueprint(sites_bp, url_prefix='/api/sites')

     app.logger.info("Blueprints registered")

@@ -487,6 +501,35 @@ def register_error_handlers(app: Flask) -> None:
         return render_template('errors/500.html', error=error), 500


+def register_context_processors(app: Flask) -> None:
+    """
+    Register template context processors.
+
+    Makes common variables available to all templates without having to
+    pass them explicitly in every render_template call.
+
+    Args:
+        app: Flask application instance
+    """
+    @app.context_processor
+    def inject_app_settings():
+        """
+        Inject application metadata into all templates.
+
+        Returns:
+            Dictionary of variables to add to template context
+        """
+        from web.config import APP_NAME, APP_VERSION, REPO_URL
+
+        return {
+            'app_name': APP_NAME,
+            'app_version': APP_VERSION,
+            'repo_url': REPO_URL
+        }
+
+    app.logger.info("Context processors registered")
+
+
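With the context processor registered, any template can reference the injected metadata without explicit arguments. A minimal sketch using Flask's `render_template_string` inside a test request context; the `create_app` import path is an assumption:

```python
from flask import render_template_string
from web.app import create_app  # assumed factory location

app = create_app()
with app.test_request_context('/'):
    # Renders e.g. "SneakyScanner 1.0.0-beta" without passing variables explicitly
    print(render_template_string('{{ app_name }} {{ app_version }}'))
```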
 def register_request_handlers(app: Flask) -> None:
     """
     Register request and response handlers.
16 app/web/config.py (Normal file)
@@ -0,0 +1,16 @@
"""
|
||||
Application configuration and metadata.
|
||||
|
||||
Contains version information and other application-level constants
|
||||
that are managed by developers, not stored in the database.
|
||||
"""
|
||||
|
||||
# Application metadata
|
||||
APP_NAME = 'SneakyScanner'
|
||||
APP_VERSION = '1.0.0-beta'
|
||||
|
||||
# Repository URL
|
||||
REPO_URL = 'https://git.sneakygeek.net/sneakygeek/SneakyScan'
|
||||
|
||||
# Scanner settings
|
||||
NMAP_HOST_TIMEOUT = '2m' # Timeout per host for nmap service detection
|
||||
@@ -5,7 +5,9 @@ This module handles the execution of scans in background threads,
 updating database status and handling errors.
 """

+import json
 import logging
+import threading
 import traceback
 from datetime import datetime
 from pathlib import Path
@@ -13,14 +15,170 @@ from pathlib import Path
 from sqlalchemy import create_engine
 from sqlalchemy.orm import sessionmaker

-from src.scanner import SneakyScanner
-from web.models import Scan
+from src.scanner import SneakyScanner, ScanCancelledError
+from web.models import Scan, ScanProgress
 from web.services.scan_service import ScanService
 from web.services.alert_service import AlertService

 logger = logging.getLogger(__name__)

+# Registry for tracking running scanners (scan_id -> SneakyScanner instance)
+_running_scanners = {}
+_running_scanners_lock = threading.Lock()

-def execute_scan(scan_id: int, config_file: str, db_url: str):

+def get_running_scanner(scan_id: int):
+    """Get a running scanner instance by scan ID."""
+    with _running_scanners_lock:
+        return _running_scanners.get(scan_id)
+
+
+def stop_scan(scan_id: int, db_url: str) -> bool:
+    """
+    Stop a running scan.
+
+    Args:
+        scan_id: ID of the scan to stop
+        db_url: Database connection URL
+
+    Returns:
+        True if scan was cancelled, False if not found or already stopped
+    """
+    logger.info(f"Attempting to stop scan {scan_id}")
+
+    # Get the scanner instance
+    scanner = get_running_scanner(scan_id)
+    if not scanner:
+        logger.warning(f"Scanner for scan {scan_id} not found in registry")
+        return False
+
+    # Cancel the scanner
+    scanner.cancel()
+    logger.info(f"Cancellation signal sent to scan {scan_id}")
+
+    return True
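The registry and `stop_scan()` give the web layer a thread-safe handle on in-flight scans. A hypothetical sketch of how a stop route might wire into it; the endpoint, module path, and config key are assumptions, not part of this hunk:

```python
# Hypothetical wiring; the actual stop endpoint is not shown in this diff.
from flask import Blueprint, current_app, jsonify

from web.scan_executor import stop_scan  # assumed module path

bp = Blueprint('scan_control', __name__)

@bp.route('/<int:scan_id>/stop', methods=['POST'])
def stop(scan_id):
    # DATABASE_URL config key is an assumption for this sketch
    cancelled = stop_scan(scan_id, current_app.config['DATABASE_URL'])
    return jsonify({'cancelled': cancelled}), (202 if cancelled else 404)
```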
+def create_progress_callback(scan_id: int, session):
+    """
+    Create a progress callback function for updating scan progress in database.
+
+    Args:
+        scan_id: ID of the scan record
+        session: Database session
+
+    Returns:
+        Callback function that accepts (phase, ip, data)
+    """
+    ip_to_site = {}
+
+    def progress_callback(phase: str, ip: str, data: dict):
+        """Update scan progress in database."""
+        nonlocal ip_to_site
+
+        try:
+            # Get scan record
+            scan = session.query(Scan).filter_by(id=scan_id).first()
+            if not scan:
+                return
+
+            # Handle initialization phase
+            if phase == 'init':
+                scan.total_ips = data.get('total_ips', 0)
+                scan.completed_ips = 0
+                scan.current_phase = 'ping'
+                ip_to_site = data.get('ip_to_site', {})
+
+                # Create progress entries for all IPs
+                for ip_addr, site_name in ip_to_site.items():
+                    progress = ScanProgress(
+                        scan_id=scan_id,
+                        ip_address=ip_addr,
+                        site_name=site_name,
+                        phase='pending',
+                        status='pending'
+                    )
+                    session.add(progress)
+
+                session.commit()
+                return
+
+            # Update current phase
+            if data.get('status') == 'starting':
+                scan.current_phase = phase
+                scan.completed_ips = 0
+                session.commit()
+                return
+
+            # Handle phase completion with results
+            if data.get('status') == 'completed':
+                results = data.get('results', {})
+
+                if phase == 'ping':
+                    # Update progress entries with ping results
+                    for ip_addr, ping_result in results.items():
+                        progress = session.query(ScanProgress).filter_by(
+                            scan_id=scan_id, ip_address=ip_addr
+                        ).first()
+                        if progress:
+                            progress.ping_result = ping_result
+                            progress.phase = 'ping'
+                            progress.status = 'completed'
+
+                    scan.completed_ips = len(results)
+
+                elif phase == 'tcp_scan':
+                    # Update progress entries with TCP/UDP port results
+                    for ip_addr, port_data in results.items():
+                        progress = session.query(ScanProgress).filter_by(
+                            scan_id=scan_id, ip_address=ip_addr
+                        ).first()
+                        if progress:
+                            progress.tcp_ports = json.dumps(port_data.get('tcp_ports', []))
+                            progress.udp_ports = json.dumps(port_data.get('udp_ports', []))
+                            progress.phase = 'tcp_scan'
+                            progress.status = 'completed'
+
+                    scan.completed_ips = len(results)
+
+                elif phase == 'service_detection':
+                    # Update progress entries with service detection results
+                    for ip_addr, services in results.items():
+                        progress = session.query(ScanProgress).filter_by(
+                            scan_id=scan_id, ip_address=ip_addr
+                        ).first()
+                        if progress:
+                            # Simplify service data for storage
+                            service_list = []
+                            for svc in services:
+                                service_list.append({
+                                    'port': svc.get('port'),
+                                    'service': svc.get('service', 'unknown'),
+                                    'product': svc.get('product', ''),
+                                    'version': svc.get('version', '')
+                                })
+                            progress.services = json.dumps(service_list)
+                            progress.phase = 'service_detection'
+                            progress.status = 'completed'
+
+                    scan.completed_ips = len(results)
+
+                elif phase == 'http_analysis':
+                    # Mark HTTP analysis as complete
+                    scan.current_phase = 'completed'
+                    scan.completed_ips = scan.total_ips
+
+            session.commit()
+
+        except Exception as e:
+            logger.error(f"Progress callback error for scan {scan_id}: {str(e)}")
+            # Don't re-raise - we don't want to break the scan
+            session.rollback()
+
+    return progress_callback
+
+
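The callback contract is `(phase, ip, data)`, with `data['status']` distinguishing a phase starting from one completing. A sketch of the call sequence the scanner is expected to drive; the scan ID, session, and payload shapes beyond the keys read above are assumptions:

```python
# Hypothetical driver showing the callback protocol end to end.
# session: an active SQLAlchemy session bound to the app database (assumed).
cb = create_progress_callback(scan_id=42, session=session)

cb('init', None, {'total_ips': 2,
                  'ip_to_site': {'10.0.0.1': 'dc1', '10.0.0.2': 'dc1'}})
cb('ping', None, {'status': 'starting'})
cb('ping', None, {'status': 'completed',
                  'results': {'10.0.0.1': True, '10.0.0.2': False}})
cb('tcp_scan', None, {'status': 'completed',
                      'results': {'10.0.0.1': {'tcp_ports': [22, 443],
                                               'udp_ports': []}}})
```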
+def execute_scan(scan_id: int, config_id: int, db_url: str = None):
     """
     Execute a scan in the background.

@@ -30,7 +188,7 @@ def execute_scan(scan_id: int, config_file: str, db_url: str):

     Args:
         scan_id: ID of the scan record in database
-        config_file: Path to YAML configuration file
+        config_id: Database config ID
         db_url: Database connection URL

     Workflow:
@@ -41,7 +199,7 @@ def execute_scan(scan_id: int, config_file: str, db_url: str):
         5. Save results to database
         6. Update status to 'completed' or 'failed'
     """
-    logger.info(f"Starting background scan execution: scan_id={scan_id}, config={config_file}")
+    logger.info(f"Starting background scan execution: scan_id={scan_id}, config_id={config_id}")

     # Create new database session for this thread
     engine = create_engine(db_url, echo=False)
@@ -60,37 +218,96 @@ def execute_scan(scan_id: int, config_file: str, db_url: str):
         scan.started_at = datetime.utcnow()
         session.commit()

-        logger.info(f"Scan {scan_id}: Initializing scanner with config {config_file}")
+        logger.info(f"Scan {scan_id}: Initializing scanner with config_id={config_id}")

-        # Convert config_file to full path if it's just a filename
-        if not config_file.startswith('/'):
-            config_path = f'/app/configs/{config_file}'
-        else:
-            config_path = config_file
+        # Initialize scanner with database config
+        scanner = SneakyScanner(config_id=config_id)

-        # Initialize scanner
-        scanner = SneakyScanner(config_path)
+        # Register scanner in the running registry
+        with _running_scanners_lock:
+            _running_scanners[scan_id] = scanner
+        logger.debug(f"Scan {scan_id}: Registered in running scanners registry")

-        # Execute scan
+        # Create progress callback
+        progress_callback = create_progress_callback(scan_id, session)
+
+        # Execute scan with progress tracking
         logger.info(f"Scan {scan_id}: Running scanner...")
         start_time = datetime.utcnow()
-        report, timestamp = scanner.scan()
+        report, timestamp = scanner.scan(progress_callback=progress_callback)
         end_time = datetime.utcnow()

         scan_duration = (end_time - start_time).total_seconds()
         logger.info(f"Scan {scan_id}: Scanner completed in {scan_duration:.2f} seconds")

-        # Generate output files (JSON, HTML, ZIP)
-        logger.info(f"Scan {scan_id}: Generating output files...")
-        scanner.generate_outputs(report, timestamp)
+        # Transition to 'finalizing' status before output generation
+        try:
+            scan = session.query(Scan).filter_by(id=scan_id).first()
+            if scan:
+                scan.status = 'finalizing'
+                scan.current_phase = 'generating_outputs'
+                session.commit()
+                logger.info(f"Scan {scan_id}: Status changed to 'finalizing'")
+        except Exception as e:
+            logger.error(f"Scan {scan_id}: Failed to update status to finalizing: {e}")
+            session.rollback()

-        # Save results to database
-        logger.info(f"Scan {scan_id}: Saving results to database...")
-        scan_service = ScanService(session)
-        scan_service._save_scan_to_db(report, scan_id, status='completed')
+        # Generate output files (JSON, HTML, ZIP) with error handling
+        output_paths = {}
+        output_generation_failed = False
+        try:
+            logger.info(f"Scan {scan_id}: Generating output files...")
+            output_paths = scanner.generate_outputs(report, timestamp)
+        except Exception as e:
+            output_generation_failed = True
+            logger.error(f"Scan {scan_id}: Output generation failed: {str(e)}")
+            logger.error(f"Scan {scan_id}: Traceback:\n{traceback.format_exc()}")
+            # Still mark scan as completed with warning since scan data is valid
+            try:
+                scan = session.query(Scan).filter_by(id=scan_id).first()
+                if scan:
+                    scan.status = 'completed'
+                    scan.error_message = f"Scan completed but output file generation failed: {str(e)}"
+                    scan.completed_at = datetime.utcnow()
|
||||
if scan.started_at:
|
||||
scan.duration = (datetime.utcnow() - scan.started_at).total_seconds()
|
||||
session.commit()
|
||||
logger.info(f"Scan {scan_id}: Marked as completed with output generation warning")
|
||||
except Exception as db_error:
|
||||
logger.error(f"Scan {scan_id}: Failed to update status after output error: {db_error}")
|
||||
|
||||
# Save results to database (only if output generation succeeded)
|
||||
if not output_generation_failed:
|
||||
logger.info(f"Scan {scan_id}: Saving results to database...")
|
||||
scan_service = ScanService(session)
|
||||
scan_service._save_scan_to_db(report, scan_id, status='completed', output_paths=output_paths)
|
||||
|
||||
# Evaluate alert rules
|
||||
logger.info(f"Scan {scan_id}: Evaluating alert rules...")
|
||||
try:
|
||||
alert_service = AlertService(session)
|
||||
alerts_triggered = alert_service.evaluate_alert_rules(scan_id)
|
||||
logger.info(f"Scan {scan_id}: {len(alerts_triggered)} alerts triggered")
|
||||
except Exception as e:
|
||||
# Don't fail the scan if alert evaluation fails
|
||||
logger.error(f"Scan {scan_id}: Alert evaluation failed: {str(e)}")
|
||||
logger.debug(f"Alert evaluation error details: {traceback.format_exc()}")
|
||||
|
||||
logger.info(f"Scan {scan_id}: Completed successfully")
|
||||
|
||||
except ScanCancelledError:
|
||||
# Scan was cancelled by user
|
||||
logger.info(f"Scan {scan_id}: Cancelled by user")
|
||||
|
||||
scan = session.query(Scan).filter_by(id=scan_id).first()
|
||||
if scan:
|
||||
scan.status = 'cancelled'
|
||||
scan.error_message = 'Scan cancelled by user'
|
||||
scan.completed_at = datetime.utcnow()
|
||||
if scan.started_at:
|
||||
scan.duration = (datetime.utcnow() - scan.started_at).total_seconds()
|
||||
session.commit()
|
||||
|
||||
except FileNotFoundError as e:
|
||||
# Config file not found
|
||||
error_msg = f"Configuration file not found: {str(e)}"
|
||||
@@ -120,6 +337,12 @@ def execute_scan(scan_id: int, config_file: str, db_url: str):
|
||||
logger.error(f"Scan {scan_id}: Failed to update error status in database: {str(db_error)}")
|
||||
|
||||
finally:
|
||||
# Unregister scanner from registry
|
||||
with _running_scanners_lock:
|
||||
if scan_id in _running_scanners:
|
||||
del _running_scanners[scan_id]
|
||||
logger.debug(f"Scan {scan_id}: Unregistered from running scanners registry")
|
||||
|
||||
# Always close the session
|
||||
session.close()
|
||||
logger.info(f"Scan {scan_id}: Background job completed, session closed")
|
||||
|
||||
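Note: a minimal sketch (not part of the diff) of how a caller might hand execute_scan off to a background thread. The Scan fields follow the model shown later in this diff; the helper name and session handling are assumptions for illustration.

# Hypothetical caller: create the Scan row, then run execute_scan off the request thread.
import threading
from datetime import datetime

def start_scan_in_background(session, config_id, db_url):
    scan = Scan(timestamp=datetime.utcnow(), status='running', config_id=config_id)
    session.add(scan)
    session.commit()
    # Daemon thread: a stuck scan will not block interpreter shutdown.
    worker = threading.Thread(target=execute_scan, args=(scan.id, config_id, db_url), daemon=True)
    worker.start()
    return scan.id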
app/web/jobs/webhook_job.py (new file, 59 lines)
@@ -0,0 +1,59 @@
+"""
+Background webhook delivery job execution.
+
+This module handles the execution of webhook deliveries in background threads,
+updating delivery logs and handling errors.
+"""
+
+import logging
+from sqlalchemy import create_engine
+from sqlalchemy.orm import sessionmaker
+
+from web.services.webhook_service import WebhookService
+
+logger = logging.getLogger(__name__)
+
+
+def execute_webhook_delivery(webhook_id: int, alert_id: int, db_url: str):
+    """
+    Execute a webhook delivery in the background.
+
+    This function is designed to run in a background thread via APScheduler.
+    It creates its own database session to avoid conflicts with the main
+    application thread.
+
+    Args:
+        webhook_id: ID of the webhook to deliver
+        alert_id: ID of the alert to send
+        db_url: Database connection URL
+
+    Workflow:
+        1. Create new database session for this thread
+        2. Call WebhookService to deliver webhook
+        3. WebhookService handles retry logic and logging
+        4. Close session
+    """
+    logger.info(f"Starting background webhook delivery: webhook_id={webhook_id}, alert_id={alert_id}")
+
+    # Create new database session for this thread
+    engine = create_engine(db_url, echo=False)
+    Session = sessionmaker(bind=engine)
+    session = Session()
+
+    try:
+        # Create webhook service and deliver
+        webhook_service = WebhookService(session)
+        success = webhook_service.deliver_webhook(webhook_id, alert_id)
+
+        if success:
+            logger.info(f"Webhook {webhook_id} delivered successfully for alert {alert_id}")
+        else:
+            logger.warning(f"Webhook {webhook_id} delivery failed for alert {alert_id}")
+
+    except Exception as e:
+        logger.error(f"Error during webhook delivery: {e}", exc_info=True)
+
+    finally:
+        session.close()
+        engine.dispose()
+        logger.info(f"Webhook delivery job completed: webhook_id={webhook_id}, alert_id={alert_id}")
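Note: a hedged sketch of how this job might be queued as a one-shot job on the shared APScheduler instance. The scheduler handle and the DATABASE_URL config key are assumptions, not code from this diff.

# Hypothetical enqueue of a single webhook delivery.
from datetime import datetime

def queue_delivery(scheduler, app_config, webhook_id, alert_id):
    scheduler.add_job(
        func=execute_webhook_delivery,
        trigger='date',                 # one-shot job, fires as soon as possible
        run_date=datetime.utcnow(),
        args=[webhook_id, alert_id, app_config['DATABASE_URL']],
        id=f'webhook-{webhook_id}-alert-{alert_id}',
        replace_existing=True,
    )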
@@ -45,8 +45,8 @@ class Scan(Base):
     id = Column(Integer, primary_key=True, autoincrement=True)
     timestamp = Column(DateTime, nullable=False, index=True, comment="Scan start time (UTC)")
     duration = Column(Float, nullable=True, comment="Total scan duration in seconds")
-    status = Column(String(20), nullable=False, default='running', comment="running, completed, failed")
-    config_file = Column(Text, nullable=True, comment="Path to YAML config used")
+    status = Column(String(20), nullable=False, default='running', comment="running, finalizing, completed, failed, cancelled")
+    config_id = Column(Integer, ForeignKey('scan_configs.id'), nullable=True, index=True, comment="FK to scan_configs table")
     title = Column(Text, nullable=True, comment="Scan title from config")
     json_path = Column(Text, nullable=True, comment="Path to JSON report")
     html_path = Column(Text, nullable=True, comment="Path to HTML report")
@@ -59,6 +59,11 @@ class Scan(Base):
     completed_at = Column(DateTime, nullable=True, comment="Scan execution completion time")
     error_message = Column(Text, nullable=True, comment="Error message if scan failed")
 
+    # Progress tracking fields
+    current_phase = Column(String(50), nullable=True, comment="Current scan phase: ping, tcp_scan, udp_scan, service_detection, http_analysis")
+    total_ips = Column(Integer, nullable=True, comment="Total number of IPs to scan")
+    completed_ips = Column(Integer, nullable=True, default=0, comment="Number of IPs completed in current phase")
+
     # Relationships
     sites = relationship('ScanSite', back_populates='scan', cascade='all, delete-orphan')
     ips = relationship('ScanIP', back_populates='scan', cascade='all, delete-orphan')
@@ -68,6 +73,9 @@ class Scan(Base):
     tls_versions = relationship('ScanTLSVersion', back_populates='scan', cascade='all, delete-orphan')
     alerts = relationship('Alert', back_populates='scan', cascade='all, delete-orphan')
     schedule = relationship('Schedule', back_populates='scans')
+    config = relationship('ScanConfig', back_populates='scans')
+    site_associations = relationship('ScanSiteAssociation', back_populates='scan', cascade='all, delete-orphan')
+    progress_entries = relationship('ScanProgress', back_populates='scan', cascade='all, delete-orphan')
 
     def __repr__(self):
         return f"<Scan(id={self.id}, title='{self.title}', status='{self.status}')>"
@@ -242,6 +250,185 @@ class ScanTLSVersion(Base):
         return f"<ScanTLSVersion(id={self.id}, tls_version='{self.tls_version}', supported={self.supported})>"
 
 
+class ScanProgress(Base):
+    """
+    Real-time progress tracking for individual IPs during scan execution.
+
+    Stores intermediate results as they become available, allowing users to
+    see progress and results before the full scan completes.
+    """
+    __tablename__ = 'scan_progress'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
+    ip_address = Column(String(45), nullable=False, comment="IP address being scanned")
+    site_name = Column(String(255), nullable=True, comment="Site name this IP belongs to")
+    phase = Column(String(50), nullable=False, comment="Phase: ping, tcp_scan, udp_scan, service_detection, http_analysis")
+    status = Column(String(20), nullable=False, default='pending', comment="pending, in_progress, completed, failed")
+
+    # Results data (stored as JSON)
+    ping_result = Column(Boolean, nullable=True, comment="Ping response result")
+    tcp_ports = Column(Text, nullable=True, comment="JSON array of discovered TCP ports")
+    udp_ports = Column(Text, nullable=True, comment="JSON array of discovered UDP ports")
+    services = Column(Text, nullable=True, comment="JSON array of detected services")
+
+    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Entry creation time")
+    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow, comment="Last update time")
+
+    # Relationships
+    scan = relationship('Scan', back_populates='progress_entries')
+
+    # Index for efficient lookups
+    __table_args__ = (
+        UniqueConstraint('scan_id', 'ip_address', name='uix_scan_progress_ip'),
+    )
+
+    def __repr__(self):
+        return f"<ScanProgress(id={self.id}, ip='{self.ip_address}', phase='{self.phase}', status='{self.status}')>"
+
+
+# ============================================================================
+# Reusable Site Definition Tables
+# ============================================================================
+
+
+class Site(Base):
+    """
+    Master site definition (reusable across scans).
+
+    Sites represent logical network segments (e.g., "Production DC", "DMZ",
+    "Branch Office") that can be reused across multiple scans. Each site
+    contains one or more CIDR ranges.
+    """
+    __tablename__ = 'sites'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    name = Column(String(255), nullable=False, unique=True, index=True, comment="Unique site name")
+    description = Column(Text, nullable=True, comment="Site description")
+    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Site creation time")
+    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow, comment="Last modification time")
+
+    # Relationships
+    ips = relationship('SiteIP', back_populates='site', cascade='all, delete-orphan')
+    scan_associations = relationship('ScanSiteAssociation', back_populates='site')
+    config_associations = relationship('ScanConfigSite', back_populates='site')
+
+    def __repr__(self):
+        return f"<Site(id={self.id}, name='{self.name}')>"
+
+
+class SiteIP(Base):
+    """
+    Individual IP addresses with their own settings.
+
+    Each IP is directly associated with a site and has its own port and ping settings.
+    IPs are standalone entities - CIDRs are only used as a convenience for bulk creation.
+    """
+    __tablename__ = 'site_ips'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    site_id = Column(Integer, ForeignKey('sites.id'), nullable=False, index=True, comment="FK to sites")
+    ip_address = Column(String(45), nullable=False, comment="IPv4 or IPv6 address")
+    expected_ping = Column(Boolean, nullable=True, comment="Expected ping response for this IP")
+    expected_tcp_ports = Column(Text, nullable=True, comment="JSON array of expected TCP ports")
+    expected_udp_ports = Column(Text, nullable=True, comment="JSON array of expected UDP ports")
+    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="IP creation time")
+
+    # Relationships
+    site = relationship('Site', back_populates='ips')
+
+    # Index for efficient IP lookups - prevent duplicate IPs within a site
+    __table_args__ = (
+        UniqueConstraint('site_id', 'ip_address', name='uix_site_ip_address'),
+    )
+
+    def __repr__(self):
+        return f"<SiteIP(id={self.id}, ip_address='{self.ip_address}')>"
+
+
+class ScanSiteAssociation(Base):
+    """
+    Many-to-many relationship between scans and sites.
+
+    Tracks which sites were included in which scans. This allows sites
+    to be reused across multiple scans.
+    """
+    __tablename__ = 'scan_site_associations'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
+    site_id = Column(Integer, ForeignKey('sites.id'), nullable=False, index=True)
+    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Association creation time")
+
+    # Relationships
+    scan = relationship('Scan', back_populates='site_associations')
+    site = relationship('Site', back_populates='scan_associations')
+
+    # Index to prevent duplicate associations
+    __table_args__ = (
+        UniqueConstraint('scan_id', 'site_id', name='uix_scan_site'),
+    )
+
+    def __repr__(self):
+        return f"<ScanSiteAssociation(scan_id={self.scan_id}, site_id={self.site_id})>"
+
+
+# ============================================================================
+# Scan Configuration Tables
+# ============================================================================
+
+
+class ScanConfig(Base):
+    """
+    Scan configurations stored in database (replaces YAML files).
+
+    Stores reusable scan configurations that reference sites from the
+    sites table. Configs define what sites to scan together.
+    """
+    __tablename__ = 'scan_configs'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    title = Column(String(255), nullable=False, comment="Configuration title")
+    description = Column(Text, nullable=True, comment="Configuration description")
+    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Config creation time")
+    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow, comment="Last modification time")
+
+    # Relationships
+    site_associations = relationship('ScanConfigSite', back_populates='config', cascade='all, delete-orphan')
+    scans = relationship('Scan', back_populates='config')
+    schedules = relationship('Schedule', back_populates='config')
+
+    def __repr__(self):
+        return f"<ScanConfig(id={self.id}, title='{self.title}')>"
+
+
+class ScanConfigSite(Base):
+    """
+    Many-to-many relationship between scan configs and sites.
+
+    Links scan configurations to the sites they should scan. A config
+    can reference multiple sites, and sites can be used in multiple configs.
+    """
+    __tablename__ = 'scan_config_sites'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    config_id = Column(Integer, ForeignKey('scan_configs.id'), nullable=False, index=True)
+    site_id = Column(Integer, ForeignKey('sites.id'), nullable=False, index=True)
+    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Association creation time")
+
+    # Relationships
+    config = relationship('ScanConfig', back_populates='site_associations')
+    site = relationship('Site', back_populates='config_associations')
+
+    # Index to prevent duplicate associations
+    __table_args__ = (
+        UniqueConstraint('config_id', 'site_id', name='uix_config_site'),
+    )
+
+    def __repr__(self):
+        return f"<ScanConfigSite(config_id={self.config_id}, site_id={self.site_id})>"
+
+
 # ============================================================================
 # Scheduling & Notifications Tables
 # ============================================================================
@@ -258,7 +445,7 @@ class Schedule(Base):
 
     id = Column(Integer, primary_key=True, autoincrement=True)
     name = Column(String(255), nullable=False, comment="Schedule name (e.g., 'Daily prod scan')")
-    config_file = Column(Text, nullable=False, comment="Path to YAML config")
+    config_id = Column(Integer, ForeignKey('scan_configs.id'), nullable=True, index=True, comment="FK to scan_configs table")
     cron_expression = Column(String(100), nullable=False, comment="Cron-like schedule (e.g., '0 2 * * *')")
     enabled = Column(Boolean, nullable=False, default=True, comment="Is schedule active?")
     last_run = Column(DateTime, nullable=True, comment="Last execution time")
@@ -268,6 +455,7 @@ class Schedule(Base):
 
     # Relationships
     scans = relationship('Scan', back_populates='schedule')
+    config = relationship('ScanConfig', back_populates='schedules')
 
     def __repr__(self):
         return f"<Schedule(id={self.id}, name='{self.name}', enabled={self.enabled})>"
@@ -284,17 +472,24 @@ class Alert(Base):
 
     id = Column(Integer, primary_key=True, autoincrement=True)
     scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
-    alert_type = Column(String(50), nullable=False, comment="new_port, cert_expiry, service_change, ping_failed")
+    rule_id = Column(Integer, ForeignKey('alert_rules.id'), nullable=True, index=True, comment="Associated alert rule")
+    alert_type = Column(String(50), nullable=False, comment="unexpected_port, drift_detection, cert_expiry, service_change, ping_failed")
     severity = Column(String(20), nullable=False, comment="info, warning, critical")
     message = Column(Text, nullable=False, comment="Human-readable alert message")
     ip_address = Column(String(45), nullable=True, comment="Related IP (optional)")
     port = Column(Integer, nullable=True, comment="Related port (optional)")
     email_sent = Column(Boolean, nullable=False, default=False, comment="Was email notification sent?")
     email_sent_at = Column(DateTime, nullable=True, comment="Email send timestamp")
+    webhook_sent = Column(Boolean, nullable=False, default=False, comment="Was webhook sent?")
+    webhook_sent_at = Column(DateTime, nullable=True, comment="Webhook send timestamp")
+    acknowledged = Column(Boolean, nullable=False, default=False, index=True, comment="Was alert acknowledged?")
+    acknowledged_at = Column(DateTime, nullable=True, comment="Acknowledgment timestamp")
+    acknowledged_by = Column(String(255), nullable=True, comment="User who acknowledged")
     created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Alert creation time")
 
     # Relationships
     scan = relationship('Scan', back_populates='alerts')
+    rule = relationship('AlertRule', back_populates='alerts')
 
     # Index for alert queries by type and severity
     __table_args__ = (
@@ -315,14 +510,83 @@ class AlertRule(Base):
     __tablename__ = 'alert_rules'
 
     id = Column(Integer, primary_key=True, autoincrement=True)
-    rule_type = Column(String(50), nullable=False, comment="unexpected_port, cert_expiry, service_down, etc.")
+    name = Column(String(255), nullable=True, comment="User-friendly rule name")
+    rule_type = Column(String(50), nullable=False, comment="unexpected_port, cert_expiry, service_down, drift_detection, etc.")
     enabled = Column(Boolean, nullable=False, default=True, comment="Is rule active?")
     threshold = Column(Integer, nullable=True, comment="Threshold value (e.g., days for cert expiry)")
     email_enabled = Column(Boolean, nullable=False, default=False, comment="Send email for this rule?")
     webhook_enabled = Column(Boolean, nullable=False, default=False, comment="Send webhook for this rule?")
+    severity = Column(String(20), nullable=True, comment="Alert severity: critical, warning, info")
+    filter_conditions = Column(Text, nullable=True, comment="JSON filter conditions for the rule")
+    config_id = Column(Integer, ForeignKey('scan_configs.id'), nullable=True, index=True, comment="Optional: specific config this rule applies to")
     created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Rule creation time")
+    updated_at = Column(DateTime, nullable=True, comment="Last update time")
+
+    # Relationships
+    alerts = relationship("Alert", back_populates="rule", cascade="all, delete-orphan")
+    config = relationship("ScanConfig", backref="alert_rules")
 
     def __repr__(self):
-        return f"<AlertRule(id={self.id}, rule_type='{self.rule_type}', enabled={self.enabled})>"
+        return f"<AlertRule(id={self.id}, name='{self.name}', rule_type='{self.rule_type}', enabled={self.enabled})>"
+
+
+class Webhook(Base):
+    """
+    Webhook configurations for alert notifications.
+
+    Stores webhook endpoints and authentication details for sending alert
+    notifications to external systems.
+    """
+    __tablename__ = 'webhooks'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    name = Column(String(255), nullable=False, comment="Webhook name")
+    url = Column(Text, nullable=False, comment="Webhook URL")
+    enabled = Column(Boolean, nullable=False, default=True, comment="Is webhook enabled?")
+    auth_type = Column(String(20), nullable=True, comment="Authentication type: none, bearer, basic, custom")
+    auth_token = Column(Text, nullable=True, comment="Encrypted authentication token")
+    custom_headers = Column(Text, nullable=True, comment="JSON custom headers")
+    alert_types = Column(Text, nullable=True, comment="JSON array of alert types to trigger on")
+    severity_filter = Column(Text, nullable=True, comment="JSON array of severities to trigger on")
+    timeout = Column(Integer, nullable=True, default=10, comment="Request timeout in seconds")
+    retry_count = Column(Integer, nullable=True, default=3, comment="Number of retry attempts")
+    template = Column(Text, nullable=True, comment="Jinja2 template for webhook payload")
+    template_format = Column(String(20), nullable=True, default='json', comment="Template output format: json, text")
+    content_type_override = Column(String(100), nullable=True, comment="Optional custom Content-Type header")
+    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Creation time")
+    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Last update time")
+
+    # Relationships
+    delivery_logs = relationship("WebhookDeliveryLog", back_populates="webhook", cascade="all, delete-orphan")
+
+    def __repr__(self):
+        return f"<Webhook(id={self.id}, name='{self.name}', enabled={self.enabled})>"
+
+
+class WebhookDeliveryLog(Base):
+    """
+    Webhook delivery tracking.
+
+    Logs all webhook delivery attempts for auditing and debugging purposes.
+    """
+    __tablename__ = 'webhook_delivery_log'
+
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    webhook_id = Column(Integer, ForeignKey('webhooks.id'), nullable=False, index=True, comment="Associated webhook")
+    alert_id = Column(Integer, ForeignKey('alerts.id'), nullable=False, index=True, comment="Associated alert")
+    status = Column(String(20), nullable=True, index=True, comment="Delivery status: success, failed, retrying")
+    response_code = Column(Integer, nullable=True, comment="HTTP response code")
+    response_body = Column(Text, nullable=True, comment="Response body from webhook")
+    error_message = Column(Text, nullable=True, comment="Error message if failed")
+    attempt_number = Column(Integer, nullable=True, comment="Which attempt this was")
+    delivered_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Delivery timestamp")
+
+    # Relationships
+    webhook = relationship("Webhook", back_populates="delivery_logs")
+    alert = relationship("Alert")
+
+    def __repr__(self):
+        return f"<WebhookDeliveryLog(id={self.id}, webhook_id={self.webhook_id}, status='{self.status}')>"
+
+
 # ============================================================================
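Note: given the ScanProgress model above, a live-progress endpoint could summarize per-IP state roughly as follows. This helper is hypothetical, not part of this diff; it only uses columns defined above.

import json

def get_progress_summary(session, scan_id):
    # One row per IP, guaranteed by the (scan_id, ip_address) unique constraint.
    entries = session.query(ScanProgress).filter_by(scan_id=scan_id).all()
    return [
        {
            'ip': e.ip_address,
            'site': e.site_name,
            'phase': e.phase,
            'status': e.status,
            'tcp_ports': json.loads(e.tcp_ports) if e.tcp_ports else [],
        }
        for e in entries
    ]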
@@ -5,8 +5,9 @@ Provides dashboard and scan viewing pages.
 """
 
 import logging
+import os
 
-from flask import Blueprint, current_app, redirect, render_template, url_for
+from flask import Blueprint, current_app, redirect, render_template, request, send_from_directory, url_for
 
 from web.auth.decorators import login_required
 
@@ -35,20 +36,7 @@ def dashboard():
     Returns:
         Rendered dashboard template
     """
-    import os
-
-    # Get list of available config files
-    configs_dir = '/app/configs'
-    config_files = []
-
-    try:
-        if os.path.exists(configs_dir):
-            config_files = [f for f in os.listdir(configs_dir) if f.endswith(('.yaml', '.yml'))]
-            config_files.sort()
-    except Exception as e:
-        logger.error(f"Error listing config files: {e}")
-
-    return render_template('dashboard.html', config_files=config_files)
+    return render_template('dashboard.html')
 
 
 @bp.route('/scans')
@@ -60,20 +48,7 @@ def scans():
     Returns:
         Rendered scans list template
     """
-    import os
-
-    # Get list of available config files
-    configs_dir = '/app/configs'
-    config_files = []
-
-    try:
-        if os.path.exists(configs_dir):
-            config_files = [f for f in os.listdir(configs_dir) if f.endswith(('.yaml', '.yml'))]
-            config_files.sort()
-    except Exception as e:
-        logger.error(f"Error listing config files: {e}")
-
-    return render_template('scans.html', config_files=config_files)
+    return render_template('scans.html')
 
 
 @bp.route('/scans/<int:scan_id>')
@@ -108,6 +83,19 @@ def compare_scans(scan_id1, scan_id2):
     return render_template('scan_compare.html', scan_id1=scan_id1, scan_id2=scan_id2)
 
 
+@bp.route('/search/ip')
+@login_required
+def search_ip():
+    """
+    IP search results page - shows scans containing a specific IP address.
+
+    Returns:
+        Rendered search results template
+    """
+    ip_address = request.args.get('ip', '').strip()
+    return render_template('ip_search_results.html', ip_address=ip_address)
+
+
 @bp.route('/schedules')
 @login_required
 def schedules():
@@ -127,22 +115,19 @@ def create_schedule():
     Create new schedule form page.
 
     Returns:
-        Rendered schedule create template with available config files
+        Rendered schedule create template with available configs
     """
-    import os
+    from web.models import ScanConfig
 
-    # Get list of available config files
-    configs_dir = '/app/configs'
-    config_files = []
+    # Get list of available configs from database
+    configs = []
 
     try:
-        if os.path.exists(configs_dir):
-            config_files = [f for f in os.listdir(configs_dir) if f.endswith('.yaml')]
-            config_files.sort()
+        configs = current_app.db_session.query(ScanConfig).order_by(ScanConfig.title).all()
     except Exception as e:
-        logger.error(f"Error listing config files: {e}")
+        logger.error(f"Error listing configs: {e}")
 
-    return render_template('schedule_create.html', config_files=config_files)
+    return render_template('schedule_create.html', configs=configs)
 
 
 @bp.route('/schedules/<int:schedule_id>/edit')
@@ -157,13 +142,23 @@ def edit_schedule(schedule_id):
     Returns:
         Rendered schedule edit template
     """
-    from flask import flash
-
     # Note: Schedule data is loaded via AJAX in the template
     # This just renders the page with the schedule_id in the URL
     return render_template('schedule_edit.html', schedule_id=schedule_id)
 
 
+@bp.route('/sites')
+@login_required
+def sites():
+    """
+    Sites management page - manage reusable site definitions.
+
+    Returns:
+        Rendered sites template
+    """
+    return render_template('sites.html')
+
+
 @bp.route('/configs')
 @login_required
 def configs():
@@ -176,46 +171,118 @@ def configs():
     return render_template('configs.html')
 
 
-@bp.route('/configs/upload')
+@bp.route('/alerts')
 @login_required
-def upload_config():
+def alerts():
     """
-    Config upload page - allows CIDR/YAML upload.
+    Alerts history page - shows all alerts.
 
     Returns:
-        Rendered config upload template
+        Rendered alerts template
     """
-    return render_template('config_upload.html')
+    from flask import request, current_app
+    from web.models import Alert, AlertRule, Scan
+    from web.utils.pagination import paginate
+
+    # Get query parameters for filtering
+    page = request.args.get('page', 1, type=int)
+    per_page = 20
+    severity = request.args.get('severity')
+    alert_type = request.args.get('alert_type')
+    acknowledged = request.args.get('acknowledged')
+
+    # Build query
+    query = current_app.db_session.query(Alert).join(Scan, isouter=True)
+
+    # Apply filters
+    if severity:
+        query = query.filter(Alert.severity == severity)
+    if alert_type:
+        query = query.filter(Alert.alert_type == alert_type)
+    if acknowledged is not None:
+        ack_bool = acknowledged == 'true'
+        query = query.filter(Alert.acknowledged == ack_bool)
+
+    # Order by severity and date
+    query = query.order_by(Alert.severity.desc(), Alert.created_at.desc())
+
+    # Paginate using utility function
+    pagination = paginate(query, page=page, per_page=per_page)
+    alerts = pagination.items
+
+    # Get unique alert types for filter dropdown
+    try:
+        alert_types = current_app.db_session.query(Alert.alert_type).distinct().all()
+        alert_types = [at[0] for at in alert_types] if alert_types else []
+    except Exception:
+        alert_types = []
+
+    return render_template(
+        'alerts.html',
+        alerts=alerts,
+        pagination=pagination,
+        current_severity=severity,
+        current_alert_type=alert_type,
+        current_acknowledged=acknowledged,
+        alert_types=alert_types
+    )
 
 
-@bp.route('/configs/edit/<filename>')
+@bp.route('/alerts/rules')
 @login_required
-def edit_config(filename):
+def alert_rules():
     """
-    Config edit page - allows editing YAML configuration.
+    Alert rules management page.
 
     Returns:
-        Rendered config edit template
+        Rendered alert rules template
     """
-    from web.services.config_service import ConfigService
-    from flask import flash, redirect
-
-    try:
-        config_service = ConfigService()
-        config_data = config_service.get_config(filename)
-
-        return render_template(
-            'config_edit.html',
-            filename=config_data['filename'],
-            content=config_data['content']
-        )
-    except FileNotFoundError:
-        flash(f"Config file '{filename}' not found", 'error')
-        return redirect(url_for('main.configs'))
-    except Exception as e:
-        logger.error(f"Error loading config for edit: {e}")
-        flash(f"Error loading config: {str(e)}", 'error')
-        return redirect(url_for('main.configs'))
+    from flask import current_app
+    from web.models import AlertRule
+
+    # Get all alert rules with error handling
+    try:
+        rules = current_app.db_session.query(AlertRule).order_by(
+            AlertRule.name.nullslast(),
+            AlertRule.rule_type
+        ).all()
+    except Exception as e:
+        logger.error(f"Error fetching alert rules: {e}")
+        rules = []
+
+    # Ensure rules is always a list
+    if rules is None:
+        rules = []
+
+    return render_template(
+        'alert_rules.html',
+        rules=rules
+    )
+
+
+@bp.route('/help')
+@login_required
+def help():
+    """
+    Help page - explains how to use the application.
+
+    Returns:
+        Rendered help template
+    """
+    return render_template('help.html')
+
+
+@bp.route('/output/<path:filename>')
+@login_required
+def serve_output_file(filename):
+    """
+    Serve output files (JSON, HTML, ZIP) from the output directory.
+
+    Args:
+        filename: Name of the file to serve
+
+    Returns:
+        The requested file
+    """
+    output_dir = os.environ.get('OUTPUT_DIR', '/app/output')
+    return send_from_directory(output_dir, filename)
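Note: the /alerts view above calls web.utils.pagination.paginate, which this diff does not show. A minimal sketch of the shape implied by the call site (class and attribute names are assumptions inferred from usage):

class Pagination:
    def __init__(self, items, page, per_page, total):
        self.items = items
        self.page = page
        self.per_page = per_page
        self.total = total
        self.pages = (total + per_page - 1) // per_page  # ceiling division

def paginate(query, page=1, per_page=20):
    # Count first, then fetch only the requested window of rows.
    total = query.count()
    items = query.offset((page - 1) * per_page).limit(per_page).all()
    return Pagination(items, page, per_page, total)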
app/web/routes/webhooks.py (new file, 83 lines)
@@ -0,0 +1,83 @@
+"""
+Webhook web routes for SneakyScanner.
+
+Provides UI pages for managing webhooks and viewing delivery logs.
+"""
+
+import logging
+
+from flask import Blueprint, render_template, request, redirect, url_for, flash, current_app
+
+from web.auth.decorators import login_required
+from web.models import Webhook
+from web.services.webhook_service import WebhookService
+
+logger = logging.getLogger(__name__)
+
+bp = Blueprint('webhooks', __name__)
+
+
+@bp.route('')
+@login_required
+def list_webhooks():
+    """
+    Webhooks list page - shows all configured webhooks.
+
+    Returns:
+        Rendered webhooks list template
+    """
+    return render_template('webhooks/list.html')
+
+
+@bp.route('/new')
+@login_required
+def new_webhook():
+    """
+    New webhook form page.
+
+    Returns:
+        Rendered webhook form template
+    """
+    return render_template('webhooks/form.html', webhook=None, mode='create')
+
+
+@bp.route('/<int:webhook_id>/edit')
+@login_required
+def edit_webhook(webhook_id):
+    """
+    Edit webhook form page.
+
+    Args:
+        webhook_id: Webhook ID to edit
+
+    Returns:
+        Rendered webhook form template or 404
+    """
+    webhook = current_app.db_session.query(Webhook).filter(Webhook.id == webhook_id).first()
+
+    if not webhook:
+        flash('Webhook not found', 'error')
+        return redirect(url_for('webhooks.list_webhooks'))
+
+    return render_template('webhooks/form.html', webhook=webhook, mode='edit')
+
+
+@bp.route('/<int:webhook_id>/logs')
+@login_required
+def webhook_logs(webhook_id):
+    """
+    Webhook delivery logs page.
+
+    Args:
+        webhook_id: Webhook ID
+
+    Returns:
+        Rendered webhook logs template or 404
+    """
+    webhook = current_app.db_session.query(Webhook).filter(Webhook.id == webhook_id).first()
+
+    if not webhook:
+        flash('Webhook not found', 'error')
+        return redirect(url_for('webhooks.list_webhooks'))
+
+    return render_template('webhooks/logs.html', webhook=webhook)
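Note: to illustrate the Webhook model's template and template_format columns that these pages edit, here is a hedged example of rendering a user-supplied Jinja2 payload template. The exact template context the service passes is an assumption; only the column semantics come from the diff.

# Hypothetical render of a saved payload template (template_format='json').
from jinja2 import Template

payload_template = '{"text": "[{{ severity | upper }}] {{ message }}", "port": {{ port or 0 }}}'
rendered = Template(payload_template).render(
    severity='warning',
    message='Unexpected port open on 10.0.0.5:8080/tcp',
    port=8080,
)
print(rendered)  # {"text": "[WARNING] Unexpected port open on 10.0.0.5:8080/tcp", "port": 8080}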
app/web/services/alert_service.py (new file, 521 lines)
@@ -0,0 +1,521 @@
|
||||
"""
|
||||
Alert Service Module
|
||||
|
||||
Handles alert evaluation, rule processing, and notification triggering
|
||||
for SneakyScan Phase 5.
|
||||
"""
|
||||
import logging
|
||||
from datetime import datetime, timezone
|
||||
from typing import List, Dict, Optional, Any
|
||||
from sqlalchemy.orm import Session
|
||||
|
||||
from ..models import (
|
||||
Alert, AlertRule, Scan, ScanPort, ScanIP, ScanService as ScanServiceModel,
|
||||
ScanCertificate, ScanTLSVersion
|
||||
)
|
||||
from .scan_service import ScanService
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class AlertService:
|
||||
"""
|
||||
Service for evaluating alert rules and generating alerts based on scan results.
|
||||
|
||||
Supports two main alert types:
|
||||
1. Unexpected Port Detection - Alerts when ports marked as unexpected are found open
|
||||
2. Drift Detection - Alerts when scan results differ from previous scan
|
||||
"""
|
||||
|
||||
def __init__(self, db_session: Session):
|
||||
self.db = db_session
|
||||
self.scan_service = ScanService(db_session)
|
||||
|
||||
def evaluate_alert_rules(self, scan_id: int) -> List[Alert]:
|
||||
"""
|
||||
Main entry point for alert evaluation after scan completion.
|
||||
|
||||
Args:
|
||||
scan_id: ID of the completed scan to evaluate
|
||||
|
||||
Returns:
|
||||
List of Alert objects that were created
|
||||
"""
|
||||
logger.info(f"Starting alert evaluation for scan {scan_id}")
|
||||
|
||||
# Get the scan
|
||||
scan = self.db.query(Scan).filter(Scan.id == scan_id).first()
|
||||
if not scan:
|
||||
logger.error(f"Scan {scan_id} not found")
|
||||
return []
|
||||
|
||||
# Get all enabled alert rules
|
||||
rules = self.db.query(AlertRule).filter(AlertRule.enabled == True).all()
|
||||
logger.info(f"Found {len(rules)} enabled alert rules to evaluate")
|
||||
|
||||
alerts_created = []
|
||||
|
||||
for rule in rules:
|
||||
try:
|
||||
# Check if rule applies to this scan's config
|
||||
if rule.config_id and scan.config_id != rule.config_id:
|
||||
logger.debug(f"Skipping rule {rule.id} - config mismatch")
|
||||
continue
|
||||
|
||||
# Evaluate based on rule type
|
||||
alert_data = []
|
||||
|
||||
if rule.rule_type == 'unexpected_port':
|
||||
alert_data = self.check_unexpected_ports(scan, rule)
|
||||
elif rule.rule_type == 'drift_detection':
|
||||
alert_data = self.check_drift_from_previous(scan, rule)
|
||||
elif rule.rule_type == 'cert_expiry':
|
||||
alert_data = self.check_certificate_expiry(scan, rule)
|
||||
elif rule.rule_type == 'weak_tls':
|
||||
alert_data = self.check_weak_tls(scan, rule)
|
||||
elif rule.rule_type == 'ping_failed':
|
||||
alert_data = self.check_ping_failures(scan, rule)
|
||||
else:
|
||||
logger.warning(f"Unknown rule type: {rule.rule_type}")
|
||||
continue
|
||||
|
||||
# Create alerts for any findings
|
||||
for alert_info in alert_data:
|
||||
alert = self.create_alert(scan_id, rule, alert_info)
|
||||
if alert:
|
||||
alerts_created.append(alert)
|
||||
|
||||
# Trigger notifications if configured
|
||||
if rule.email_enabled or rule.webhook_enabled:
|
||||
self.trigger_notifications(alert, rule)
|
||||
|
||||
logger.info(f"Rule {rule.name or rule.id} generated {len(alert_data)} alerts")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error evaluating rule {rule.id}: {str(e)}")
|
||||
continue
|
||||
|
||||
logger.info(f"Alert evaluation complete. Created {len(alerts_created)} alerts")
|
||||
return alerts_created
|
||||
|
||||
def check_unexpected_ports(self, scan: Scan, rule: AlertRule) -> List[Dict[str, Any]]:
|
||||
"""
|
||||
Detect ports that are open but not in the expected_ports list.
|
||||
|
||||
Args:
|
||||
scan: The scan to check
|
||||
rule: The alert rule configuration
|
||||
|
||||
Returns:
|
||||
List of alert data dictionaries
|
||||
"""
|
||||
alerts_to_create = []
|
||||
|
||||
# Get all ports where expected=False
|
||||
unexpected_ports = (
|
||||
self.db.query(ScanPort, ScanIP)
|
||||
.join(ScanIP, ScanPort.ip_id == ScanIP.id)
|
||||
.filter(ScanPort.scan_id == scan.id)
|
||||
.filter(ScanPort.expected == False) # Not in config's expected_ports
|
||||
.filter(ScanPort.state == 'open')
|
||||
.all()
|
||||
)
|
||||
|
||||
# High-risk ports that should trigger critical alerts
|
||||
high_risk_ports = {
|
||||
22, # SSH
|
||||
23, # Telnet
|
||||
135, # Windows RPC
|
||||
139, # NetBIOS
|
||||
445, # SMB
|
||||
1433, # SQL Server
|
||||
3306, # MySQL
|
||||
3389, # RDP
|
||||
5432, # PostgreSQL
|
||||
5900, # VNC
|
||||
6379, # Redis
|
||||
9200, # Elasticsearch
|
||||
27017, # MongoDB
|
||||
}
|
||||
|
||||
for port, ip in unexpected_ports:
|
||||
# Determine severity based on port number
|
||||
severity = rule.severity or ('critical' if port.port in high_risk_ports else 'warning')
|
||||
|
||||
# Get service info if available
|
||||
service = (
|
||||
self.db.query(ScanServiceModel)
|
||||
.filter(ScanServiceModel.port_id == port.id)
|
||||
.first()
|
||||
)
|
||||
|
||||
service_info = ""
|
||||
if service:
|
||||
product = service.product or "Unknown"
|
||||
version = service.version or ""
|
||||
service_info = f" (Service: {service.service_name}: {product} {version}".strip() + ")"
|
||||
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'unexpected_port',
|
||||
'severity': severity,
|
||||
'message': f"Unexpected port open on {ip.ip_address}:{port.port}/{port.protocol}{service_info}",
|
||||
'ip_address': ip.ip_address,
|
||||
'port': port.port
|
||||
})
|
||||
|
||||
return alerts_to_create
|
||||
|
||||
def check_drift_from_previous(self, scan: Scan, rule: AlertRule) -> List[Dict[str, Any]]:
|
||||
"""
|
||||
Compare current scan to the last scan with the same config.
|
||||
|
||||
Args:
|
||||
scan: The current scan
|
||||
rule: The alert rule configuration
|
||||
|
||||
Returns:
|
||||
List of alert data dictionaries
|
||||
"""
|
||||
alerts_to_create = []
|
||||
|
||||
# Find previous scan with same config_id
|
||||
previous_scan = (
|
||||
self.db.query(Scan)
|
||||
.filter(Scan.config_id == scan.config_id)
|
||||
.filter(Scan.id < scan.id)
|
||||
.filter(Scan.status == 'completed')
|
||||
.order_by(Scan.started_at.desc() if Scan.started_at else Scan.timestamp.desc())
|
||||
.first()
|
||||
)
|
||||
|
||||
if not previous_scan:
|
||||
logger.info(f"No previous scan found for config_id {scan.config_id}")
|
||||
return []
|
||||
|
||||
try:
|
||||
# Use existing comparison logic from scan_service
|
||||
comparison = self.scan_service.compare_scans(previous_scan.id, scan.id)
|
||||
|
||||
# Alert on new ports
|
||||
for port_data in comparison.get('ports', {}).get('added', []):
|
||||
severity = rule.severity or 'warning'
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'drift_new_port',
|
||||
'severity': severity,
|
||||
'message': f"New port detected: {port_data['ip']}:{port_data['port']}/{port_data['protocol']}",
|
||||
'ip_address': port_data['ip'],
|
||||
'port': port_data['port']
|
||||
})
|
||||
|
||||
# Alert on removed ports
|
||||
for port_data in comparison.get('ports', {}).get('removed', []):
|
||||
severity = rule.severity or 'info'
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'drift_missing_port',
|
||||
'severity': severity,
|
||||
'message': f"Port no longer open: {port_data['ip']}:{port_data['port']}/{port_data['protocol']}",
|
||||
'ip_address': port_data['ip'],
|
||||
'port': port_data['port']
|
||||
})
|
||||
|
||||
# Alert on service changes
|
||||
for svc_data in comparison.get('services', {}).get('changed', []):
|
||||
old_svc = svc_data.get('old', {})
|
||||
new_svc = svc_data.get('new', {})
|
||||
|
||||
old_desc = f"{old_svc.get('product', 'Unknown')} {old_svc.get('version', '')}".strip()
|
||||
new_desc = f"{new_svc.get('product', 'Unknown')} {new_svc.get('version', '')}".strip()
|
||||
|
||||
severity = rule.severity or 'info'
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'drift_service_change',
|
||||
'severity': severity,
|
||||
'message': f"Service changed on {svc_data['ip']}:{svc_data['port']}: {old_desc} → {new_desc}",
|
||||
'ip_address': svc_data['ip'],
|
||||
'port': svc_data['port']
|
||||
})
|
||||
|
||||
# Alert on certificate changes
|
||||
for cert_data in comparison.get('certificates', {}).get('changed', []):
|
||||
old_cert = cert_data.get('old', {})
|
||||
new_cert = cert_data.get('new', {})
|
||||
|
||||
severity = rule.severity or 'warning'
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'drift_cert_change',
|
||||
'severity': severity,
|
||||
'message': f"Certificate changed on {cert_data['ip']}:{cert_data['port']} - "
|
||||
f"Subject: {old_cert.get('subject', 'Unknown')} → {new_cert.get('subject', 'Unknown')}",
|
||||
'ip_address': cert_data['ip'],
|
||||
'port': cert_data['port']
|
||||
})
|
||||
|
||||
# Check drift score threshold if configured
|
||||
if rule.threshold and comparison.get('drift_score', 0) * 100 >= rule.threshold:
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'drift_threshold_exceeded',
|
||||
'severity': rule.severity or 'warning',
|
||||
'message': f"Drift score {comparison['drift_score']*100:.1f}% exceeds threshold {rule.threshold}%",
|
||||
'ip_address': None,
|
||||
'port': None
|
||||
})
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error comparing scans: {str(e)}")
|
||||
|
||||
return alerts_to_create
|
||||
|
||||
def check_certificate_expiry(self, scan: Scan, rule: AlertRule) -> List[Dict[str, Any]]:
|
||||
"""
|
||||
Check for certificates expiring within the threshold days.
|
||||
|
||||
Args:
|
||||
scan: The scan to check
|
||||
rule: The alert rule configuration
|
||||
|
||||
Returns:
|
||||
List of alert data dictionaries
|
||||
"""
|
||||
alerts_to_create = []
|
||||
threshold_days = rule.threshold or 30 # Default 30 days
|
||||
|
||||
# Get all certificates from the scan
|
||||
certificates = (
|
||||
self.db.query(ScanCertificate, ScanPort, ScanIP)
|
||||
.join(ScanServiceModel, ScanCertificate.service_id == ScanServiceModel.id)
|
||||
.join(ScanPort, ScanServiceModel.port_id == ScanPort.id)
|
||||
.join(ScanIP, ScanPort.ip_id == ScanIP.id)
|
||||
.filter(ScanPort.scan_id == scan.id)
|
||||
.all()
|
||||
)
|
||||
|
||||
for cert, port, ip in certificates:
|
||||
if cert.days_until_expiry is not None and cert.days_until_expiry <= threshold_days:
|
||||
if cert.days_until_expiry <= 0:
|
||||
severity = 'critical'
|
||||
message = f"Certificate EXPIRED on {ip.ip_address}:{port.port}"
|
||||
elif cert.days_until_expiry <= 7:
|
||||
severity = 'critical'
|
||||
message = f"Certificate expires in {cert.days_until_expiry} days on {ip.ip_address}:{port.port}"
|
||||
elif cert.days_until_expiry <= 14:
|
||||
severity = 'warning'
|
||||
message = f"Certificate expires in {cert.days_until_expiry} days on {ip.ip_address}:{port.port}"
|
||||
else:
|
||||
severity = 'info'
|
||||
message = f"Certificate expires in {cert.days_until_expiry} days on {ip.ip_address}:{port.port}"
|
||||
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'cert_expiry',
|
||||
'severity': severity,
|
||||
'message': message,
|
||||
'ip_address': ip.ip_address,
|
||||
'port': port.port
|
||||
})
|
||||
|
||||
return alerts_to_create
|
||||
|
||||
def check_weak_tls(self, scan: Scan, rule: AlertRule) -> List[Dict[str, Any]]:
|
||||
"""
|
||||
Check for weak TLS versions (1.0, 1.1).
|
||||
|
||||
Args:
|
||||
scan: The scan to check
|
||||
rule: The alert rule configuration
|
||||
|
||||
Returns:
|
||||
List of alert data dictionaries
|
||||
"""
|
||||
alerts_to_create = []
|
||||
|
||||
# Get all TLS version data from the scan
|
||||
tls_versions = (
|
||||
self.db.query(ScanTLSVersion, ScanPort, ScanIP)
|
||||
.join(ScanCertificate, ScanTLSVersion.certificate_id == ScanCertificate.id)
|
||||
.join(ScanServiceModel, ScanCertificate.service_id == ScanServiceModel.id)
|
||||
.join(ScanPort, ScanServiceModel.port_id == ScanPort.id)
|
||||
.join(ScanIP, ScanPort.ip_id == ScanIP.id)
|
||||
.filter(ScanPort.scan_id == scan.id)
|
||||
.all()
|
||||
)
|
||||
|
||||
# Group TLS versions by port/IP to create one alert per host
|
||||
tls_by_host = {}
|
||||
for tls, port, ip in tls_versions:
|
||||
# Only alert on weak TLS versions that are supported
|
||||
if tls.supported and tls.tls_version in ['TLS 1.0', 'TLS 1.1']:
|
||||
key = (ip.ip_address, port.port)
|
||||
if key not in tls_by_host:
|
||||
tls_by_host[key] = {'ip': ip.ip_address, 'port': port.port, 'versions': []}
|
||||
tls_by_host[key]['versions'].append(tls.tls_version)
|
||||
|
||||
# Create alerts for hosts with weak TLS
|
||||
for host_key, host_data in tls_by_host.items():
|
||||
severity = rule.severity or 'warning'
|
||||
alerts_to_create.append({
|
||||
'alert_type': 'weak_tls',
|
||||
'severity': severity,
|
||||
'message': f"Weak TLS versions supported on {host_data['ip']}:{host_data['port']}: {', '.join(host_data['versions'])}",
|
||||
'ip_address': host_data['ip'],
|
||||
'port': host_data['port']
|
||||
})
|
||||
|
||||
return alerts_to_create
|
||||
|
||||
def check_ping_failures(self, scan: Scan, rule: AlertRule) -> List[Dict[str, Any]]:
|
||||
"""
|
||||
Check for hosts that were expected to respond to ping but didn't.
|
||||
|
||||
Args:
|
||||
scan: The scan to check
|
||||
rule: The alert rule configuration
|
||||
|
||||
Returns:
|
||||
List of alert data dictionaries
|
||||
"""
|
||||
alerts_to_create = []
|
||||
|
||||
# Get all IPs where ping was expected but failed
|
||||
failed_pings = (
|
||||
self.db.query(ScanIP)
|
||||
.filter(ScanIP.scan_id == scan.id)
|
||||
.filter(ScanIP.ping_expected == True)
|
||||
.filter(ScanIP.ping_actual == False)
|
||||
.all()
|
||||
)
|
||||
|
||||
for ip in failed_pings:
            severity = rule.severity or 'warning'
            alerts_to_create.append({
                'alert_type': 'ping_failed',
                'severity': severity,
                'message': f"Host {ip.ip_address} did not respond to ping (expected to be up)",
                'ip_address': ip.ip_address,
                'port': None
            })

        return alerts_to_create

    def create_alert(self, scan_id: int, rule: AlertRule, alert_data: Dict[str, Any]) -> Optional[Alert]:
        """
        Create an alert record in the database.

        Args:
            scan_id: ID of the scan that triggered the alert
            rule: The alert rule that was triggered
            alert_data: Dictionary with alert details

        Returns:
            Created Alert object or None if creation failed
        """
        try:
            alert = Alert(
                scan_id=scan_id,
                rule_id=rule.id,
                alert_type=alert_data['alert_type'],
                severity=alert_data['severity'],
                message=alert_data['message'],
                ip_address=alert_data.get('ip_address'),
                port=alert_data.get('port'),
                created_at=datetime.now(timezone.utc)
            )

            self.db.add(alert)
            self.db.commit()

            logger.info(f"Created alert: {alert.message}")
            return alert

        except Exception as e:
            logger.error(f"Failed to create alert: {str(e)}")
            self.db.rollback()
            return None

    def trigger_notifications(self, alert: Alert, rule: AlertRule):
        """
        Send notifications for an alert based on rule configuration.

        Args:
            alert: The alert to send notifications for
            rule: The rule that specifies notification settings
        """
        # Email notification will be implemented in email_service.py
        if rule.email_enabled:
            logger.info(f"Email notification would be sent for alert {alert.id}")
            # TODO: Call email service

        # Webhook notification - queue for delivery
        if rule.webhook_enabled:
            try:
                from flask import current_app
                from .webhook_service import WebhookService

                webhook_service = WebhookService(self.db)

                # Get matching webhooks for this alert
                matching_webhooks = webhook_service.get_matching_webhooks(alert)

                if matching_webhooks:
                    # Get scheduler from app context
                    scheduler = getattr(current_app, 'scheduler', None)

                    # Queue delivery for each matching webhook
                    for webhook in matching_webhooks:
                        webhook_service.queue_webhook_delivery(
                            webhook.id,
                            alert.id,
                            scheduler_service=scheduler
                        )
                        logger.info(f"Queued webhook {webhook.id} ({webhook.name}) for alert {alert.id}")
                else:
                    logger.debug(f"No matching webhooks found for alert {alert.id}")

            except Exception as e:
                logger.error(f"Failed to queue webhook notifications for alert {alert.id}: {e}", exc_info=True)
                # Don't fail alert creation if webhook queueing fails

    def acknowledge_alert(self, alert_id: int, acknowledged_by: str = "system") -> bool:
        """
        Acknowledge an alert.

        Args:
            alert_id: ID of the alert to acknowledge
            acknowledged_by: Username or system identifier

        Returns:
            True if successful, False otherwise
        """
        try:
            alert = self.db.query(Alert).filter(Alert.id == alert_id).first()
            if not alert:
                logger.error(f"Alert {alert_id} not found")
                return False

            alert.acknowledged = True
            alert.acknowledged_at = datetime.now(timezone.utc)
            alert.acknowledged_by = acknowledged_by

            self.db.commit()
            logger.info(f"Alert {alert_id} acknowledged by {acknowledged_by}")
            return True

        except Exception as e:
            logger.error(f"Failed to acknowledge alert {alert_id}: {str(e)}")
            self.db.rollback()
            return False

    def get_alerts_for_scan(self, scan_id: int) -> List[Alert]:
        """
        Get all alerts for a specific scan.

        Args:
            scan_id: ID of the scan

        Returns:
            List of Alert objects
        """
        return (
            self.db.query(Alert)
            .filter(Alert.scan_id == scan_id)
            .order_by(Alert.severity.desc(), Alert.created_at.desc())
            .all()
        )
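The alert lifecycle above (create, notify, acknowledge, query) is easiest to see end to end. A minimal usage sketch, assuming a SQLAlchemy session and the `AlertService` constructor wiring implied by `self.db`; the handler name is hypothetical:

```python
from web.services.alert_service import AlertService  # assumed module path

def triage_scan_alerts(session, scan_id: int) -> None:
    service = AlertService(session)  # assumes the session is the only required argument
    for alert in service.get_alerts_for_scan(scan_id):
        print(alert.severity, alert.message)
        if alert.severity == 'info':
            # Auto-acknowledge informational alerts
            service.acknowledge_alert(alert.id, acknowledged_by='auto-triage')
```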
@@ -1,552 +1,339 @@
 """
-Config Service - Business logic for config file management
+Config Service - Business logic for config management

-This service handles all operations related to scan configuration files,
-including creation, validation, listing, and deletion.
+This service handles all operations related to scan configurations,
+both database-stored (primary) and file-based (deprecated).
 """

 import os
-import re
-import yaml
-import ipaddress
-from typing import Dict, List, Tuple, Any, Optional
+from typing import Dict, List, Any, Optional
 from datetime import datetime
-from pathlib import Path
-from werkzeug.utils import secure_filename
+from sqlalchemy.orm import Session


 class ConfigService:
     """Business logic for config management"""

-    def __init__(self, configs_dir: str = '/app/configs'):
+    def __init__(self, db_session: Session = None, configs_dir: str = '/app/configs'):
         """
         Initialize the config service.

         Args:
-            configs_dir: Directory where config files are stored
+            db_session: SQLAlchemy database session (for database operations)
+            configs_dir: Directory where legacy config files are stored
         """
+        self.db = db_session
         self.configs_dir = configs_dir

-        # Ensure configs directory exists
+        # Ensure configs directory exists (for legacy YAML configs)
         os.makedirs(self.configs_dir, exist_ok=True)

-    def list_configs(self) -> List[Dict[str, Any]]:
+    # ============================================================================
+    # Database-based Config Operations (Primary)
+    # ============================================================================
+
+    def create_config(self, title: str, description: Optional[str], site_ids: List[int]) -> Dict[str, Any]:
         """
-        List all config files with metadata.
-
-        Returns:
-            List of config metadata dictionaries:
-            [
-                {
-                    "filename": "prod-scan.yaml",
-                    "title": "Prod Scan",
-                    "path": "/app/configs/prod-scan.yaml",
-                    "created_at": "2025-11-15T10:30:00Z",
-                    "size_bytes": 1234,
-                    "used_by_schedules": ["Daily Scan", "Weekly Audit"]
-                }
-            ]
-        """
-        configs = []
-
-        # Get all YAML files in configs directory
-        if not os.path.exists(self.configs_dir):
-            return configs
-
-        for filename in os.listdir(self.configs_dir):
-            if not filename.endswith(('.yaml', '.yml')):
-                continue
-
-            filepath = os.path.join(self.configs_dir, filename)
-
-            if not os.path.isfile(filepath):
-                continue
-
-            try:
-                # Get file metadata
-                stat_info = os.stat(filepath)
-                created_at = datetime.fromtimestamp(stat_info.st_mtime).isoformat() + 'Z'
-                size_bytes = stat_info.st_size
-
-                # Parse YAML to get title
-                title = None
-                try:
-                    with open(filepath, 'r') as f:
-                        data = yaml.safe_load(f)
-                    if isinstance(data, dict):
-                        title = data.get('title', filename)
-                except Exception:
-                    title = filename  # Fallback to filename if parsing fails
-
-                # Get schedules using this config
-                used_by_schedules = self.get_schedules_using_config(filename)
-
-                configs.append({
-                    'filename': filename,
-                    'title': title,
-                    'path': filepath,
-                    'created_at': created_at,
-                    'size_bytes': size_bytes,
-                    'used_by_schedules': used_by_schedules
-                })
-            except Exception as e:
-                # Skip files that can't be read
-                continue
-
-        # Sort by created_at (most recent first)
-        configs.sort(key=lambda x: x['created_at'], reverse=True)
-
-        return configs
-
-    def get_config(self, filename: str) -> Dict[str, Any]:
-        """
-        Get config file content and parsed data.
+        Create a new scan configuration in the database.

         Args:
-            filename: Config filename
+            title: Configuration title
+            description: Optional configuration description
+            site_ids: List of site IDs to include in this config

         Returns:
+            Created config as dictionary:
             {
-                "filename": "prod-scan.yaml",
-                "content": "title: Prod Scan\n...",
-                "parsed": {"title": "Prod Scan", "sites": [...]}
+                "id": 1,
+                "title": "Production Scan",
+                "description": "...",
+                "site_count": 3,
+                "sites": [...],
+                "created_at": "2025-11-19T10:30:00Z",
+                "updated_at": "2025-11-19T10:30:00Z"
             }

         Raises:
-            FileNotFoundError: If config doesn't exist
-            ValueError: If config content is invalid
+            ValueError: If validation fails or sites don't exist
         """
-        filepath = os.path.join(self.configs_dir, filename)
+        if not title or not title.strip():
+            raise ValueError("Title is required")

-        if not os.path.exists(filepath):
-            raise FileNotFoundError(f"Config file '{filename}' not found")
+        if not site_ids or len(site_ids) == 0:
+            raise ValueError("At least one site must be selected")

-        # Read file content
-        with open(filepath, 'r') as f:
-            content = f.read()
+        # Import models here to avoid circular imports
+        from web.models import ScanConfig, ScanConfigSite, Site

-        # Parse YAML
-        try:
-            parsed = yaml.safe_load(content)
-        except yaml.YAMLError as e:
-            raise ValueError(f"Invalid YAML syntax: {str(e)}")
+        # Verify all sites exist
+        existing_sites = self.db.query(Site).filter(Site.id.in_(site_ids)).all()
+        if len(existing_sites) != len(site_ids):
+            found_ids = {s.id for s in existing_sites}
+            missing_ids = set(site_ids) - found_ids
+            raise ValueError(f"Sites not found: {missing_ids}")

-        return {
-            'filename': filename,
-            'content': content,
-            'parsed': parsed
-        }
+        # Create config
+        config = ScanConfig(
+            title=title.strip(),
+            description=description.strip() if description else None,
+            created_at=datetime.utcnow(),
+            updated_at=datetime.utcnow()
+        )

-    def create_from_yaml(self, filename: str, content: str) -> str:
+        self.db.add(config)
+        self.db.flush()  # Get the config ID

+        # Create associations
+        for site_id in site_ids:
+            assoc = ScanConfigSite(
+                config_id=config.id,
+                site_id=site_id,
+                created_at=datetime.utcnow()
+            )
+            self.db.add(assoc)

+        self.db.commit()

+        return self.get_config_by_id(config.id)

+    def get_config_by_id(self, config_id: int) -> Dict[str, Any]:
         """
-        Create config from YAML content.
+        Get a scan configuration by ID.

         Args:
-            filename: Desired filename (will be sanitized)
-            content: YAML content string
+            config_id: Configuration ID

         Returns:
-            Final filename (sanitized)
+            Config as dictionary with sites

         Raises:
-            ValueError: If content invalid or filename conflict
+            ValueError: If config not found
         """
-        # Sanitize filename
-        filename = secure_filename(filename)
+        from web.models import ScanConfig

-        # Ensure .yaml extension
-        if not filename.endswith(('.yaml', '.yml')):
-            filename += '.yaml'
+        config = self.db.query(ScanConfig).filter_by(id=config_id).first()

-        filepath = os.path.join(self.configs_dir, filename)
+        if not config:
+            raise ValueError(f"Config with ID {config_id} not found")

-        # Check for conflicts
-        if os.path.exists(filepath):
-            raise ValueError(f"Config file '{filename}' already exists")
-
-        # Parse and validate YAML
-        try:
-            parsed = yaml.safe_load(content)
-        except yaml.YAMLError as e:
-            raise ValueError(f"Invalid YAML syntax: {str(e)}")
-
-        # Validate config structure
-        is_valid, error_msg = self.validate_config_content(parsed)
-        if not is_valid:
-            raise ValueError(f"Invalid config structure: {error_msg}")
-
-        # Write file
-        with open(filepath, 'w') as f:
-            f.write(content)
-
-        return filename

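For orientation, a short sketch of driving the new database-backed config API from this hunk; the session comes from the Flask app as elsewhere in the codebase, and the site IDs are assumed to exist:

```python
service = ConfigService(db_session=db_session)

config = service.create_config(
    title="Production Scan",
    description="Weekly audit of prod ranges",
    site_ids=[1, 2, 3],
)
print(config['id'], config['site_count'])   # e.g. 1 3

fetched = service.get_config_by_id(config['id'])
assert fetched['title'] == "Production Scan"
```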
-    def create_from_cidr(
-        self,
-        title: str,
-        cidr: str,
-        site_name: Optional[str] = None,
-        ping_default: bool = False
-    ) -> Tuple[str, str]:
-        """
-        Create config from CIDR range.
-
-        Args:
-            title: Scan configuration title
-            cidr: CIDR range (e.g., "10.0.0.0/24")
-            site_name: Optional site name (defaults to "Site 1")
-            ping_default: Default ping expectation for all IPs
-
-        Returns:
-            Tuple of (final_filename, yaml_content)
-
-        Raises:
-            ValueError: If CIDR invalid or other validation errors
-        """
-        # Validate and parse CIDR
-        try:
-            network = ipaddress.ip_network(cidr, strict=False)
-        except ValueError as e:
-            raise ValueError(f"Invalid CIDR range: {str(e)}")
-
-        # Check if network is too large (prevent expansion of huge ranges)
-        if network.num_addresses > 10000:
-            raise ValueError(f"CIDR range too large: {network.num_addresses} addresses. Maximum is 10,000.")
-
-        # Expand CIDR to list of IP addresses
-        ip_list = [str(ip) for ip in network.hosts()]
-
-        # If network has only 1 address (like /32 or /128), hosts() returns empty
-        # In that case, use the network address itself
-        if not ip_list:
-            ip_list = [str(network.network_address)]
-
-        # Build site name
-        if not site_name or not site_name.strip():
-            site_name = "Site 1"
-
-        # Build IP configurations
-        ips = []
-        for ip_address in ip_list:
-            ips.append({
-                'address': ip_address,
-                'expected': {
-                    'ping': ping_default,
-                    'tcp_ports': [],
-                    'udp_ports': []
-                }
+        # Get associated sites
+        sites = []
+        for assoc in config.site_associations:
+            site = assoc.site
+            sites.append({
+                'id': site.id,
+                'name': site.name,
+                'description': site.description,
+                'ip_count': len(site.ips)
             })

-        # Build YAML structure
-        config_data = {
-            'title': title.strip(),
-            'sites': [
-                {
-                    'name': site_name.strip(),
-                    'ips': ips
-                }
-            ]
+        return {
+            'id': config.id,
+            'title': config.title,
+            'description': config.description,
+            'site_count': len(sites),
+            'sites': sites,
+            'created_at': config.created_at.isoformat() + 'Z' if config.created_at else None,
+            'updated_at': config.updated_at.isoformat() + 'Z' if config.updated_at else None
         }

-        # Convert to YAML string
-        yaml_content = yaml.dump(config_data, sort_keys=False, default_flow_style=False)
-
-        # Generate filename from title
-        filename = self.generate_filename_from_title(title)
-
-        filepath = os.path.join(self.configs_dir, filename)
-
-        # Check for conflicts
-        if os.path.exists(filepath):
-            raise ValueError(f"Config file '{filename}' already exists")
-
-        # Write file
-        with open(filepath, 'w') as f:
-            f.write(yaml_content)
-
-        return filename, yaml_content

-    def update_config(self, filename: str, yaml_content: str) -> None:
+    def list_configs_db(self) -> List[Dict[str, Any]]:
         """
-        Update existing config file with new YAML content.
+        List all scan configurations from database.

+        Returns:
+            List of config dictionaries with metadata
+        """
+        from web.models import ScanConfig
+
+        configs = self.db.query(ScanConfig).order_by(ScanConfig.updated_at.desc()).all()
+
+        result = []
+        for config in configs:
+            sites = []
+            for assoc in config.site_associations:
+                site = assoc.site
+                sites.append({
+                    'id': site.id,
+                    'name': site.name
+                })
+
+            result.append({
+                'id': config.id,
+                'title': config.title,
+                'description': config.description,
+                'site_count': len(sites),
+                'sites': sites,
+                'created_at': config.created_at.isoformat() + 'Z' if config.created_at else None,
+                'updated_at': config.updated_at.isoformat() + 'Z' if config.updated_at else None
+            })
+
+        return result
+
+    def update_config(self, config_id: int, title: Optional[str], description: Optional[str], site_ids: Optional[List[int]]) -> Dict[str, Any]:
+        """
+        Update a scan configuration.

         Args:
-            filename: Config filename to update
-            yaml_content: New YAML content string
+            config_id: Configuration ID to update
+            title: New title (optional)
+            description: New description (optional)
+            site_ids: New list of site IDs (optional, replaces existing)
+
+        Returns:
+            Updated config dictionary

         Raises:
-            FileNotFoundError: If config doesn't exist
-            ValueError: If YAML content is invalid
+            ValueError: If config not found or validation fails
         """
-        filepath = os.path.join(self.configs_dir, filename)
+        from web.models import ScanConfig, ScanConfigSite, Site

-        # Check if file exists
-        if not os.path.exists(filepath):
-            raise FileNotFoundError(f"Config file '{filename}' not found")
+        config = self.db.query(ScanConfig).filter_by(id=config_id).first()

-        # Parse and validate YAML
-        try:
-            parsed = yaml.safe_load(yaml_content)
-        except yaml.YAMLError as e:
-            raise ValueError(f"Invalid YAML syntax: {str(e)}")
+        if not config:
+            raise ValueError(f"Config with ID {config_id} not found")

-        # Validate config structure
-        is_valid, error_msg = self.validate_config_content(parsed)
-        if not is_valid:
-            raise ValueError(f"Invalid config structure: {error_msg}")
+        # Update fields if provided
+        if title is not None:
+            if not title.strip():
+                raise ValueError("Title cannot be empty")
+            config.title = title.strip()

-        # Write updated content
-        with open(filepath, 'w') as f:
-            f.write(yaml_content)
+        if description is not None:
+            config.description = description.strip() if description.strip() else None

-    def delete_config(self, filename: str) -> None:
-        """
-        Delete config file and cascade delete any associated schedules.
+        # Update sites if provided
+        if site_ids is not None:
+            if len(site_ids) == 0:
+                raise ValueError("At least one site must be selected")

-        When a config is deleted, all schedules using that config (both enabled
-        and disabled) are automatically deleted as well, since they would be
-        invalid without the config file.
+            # Verify all sites exist
+            existing_sites = self.db.query(Site).filter(Site.id.in_(site_ids)).all()
+            if len(existing_sites) != len(site_ids):
+                found_ids = {s.id for s in existing_sites}
+                missing_ids = set(site_ids) - found_ids
+                raise ValueError(f"Sites not found: {missing_ids}")

-        Args:
-            filename: Config filename to delete
+            # Remove existing associations
+            self.db.query(ScanConfigSite).filter_by(config_id=config_id).delete()

-        Raises:
-            FileNotFoundError: If config doesn't exist
-        """
-        filepath = os.path.join(self.configs_dir, filename)
-
-        if not os.path.exists(filepath):
-            raise FileNotFoundError(f"Config file '{filename}' not found")
-
-        # Delete any schedules using this config (both enabled and disabled)
-        try:
-            from web.services.schedule_service import ScheduleService
-            from flask import current_app
-
-            # Get database session from Flask app
-            db = current_app.db_session
-
-            # Get all schedules
-            schedule_service = ScheduleService(db)
-            result = schedule_service.list_schedules(page=1, per_page=10000)
-            schedules = result.get('schedules', [])
-
-            # Build full path for comparison
-            config_path = os.path.join(self.configs_dir, filename)
-
-            # Find and delete all schedules using this config (enabled or disabled)
-            deleted_schedules = []
-            for schedule in schedules:
-                schedule_config = schedule.get('config_file', '')
-
-                # Handle both absolute paths and just filenames
-                if schedule_config == filename or schedule_config == config_path:
-                    schedule_id = schedule.get('id')
-                    schedule_name = schedule.get('name', 'Unknown')
-                    try:
-                        schedule_service.delete_schedule(schedule_id)
-                        deleted_schedules.append(schedule_name)
-                    except Exception as e:
-                        import logging
-                        logging.getLogger(__name__).warning(
-                            f"Failed to delete schedule {schedule_id} ('{schedule_name}'): {e}"
-                        )
-
-            if deleted_schedules:
-                import logging
-                logging.getLogger(__name__).info(
-                    f"Cascade deleted {len(deleted_schedules)} schedule(s) associated with config '{filename}': {', '.join(deleted_schedules)}"
+            # Create new associations
+            for site_id in site_ids:
+                assoc = ScanConfigSite(
+                    config_id=config_id,
+                    site_id=site_id,
+                    created_at=datetime.utcnow()
                 )
+                self.db.add(assoc)

-        except ImportError:
-            # If ScheduleService doesn't exist yet, skip schedule deletion
-            pass
-        except Exception as e:
-            # Log error but continue with config deletion
-            import logging
-            logging.getLogger(__name__).error(
-                f"Error deleting schedules for config {filename}: {e}", exc_info=True
-            )
+        config.updated_at = datetime.utcnow()
+        self.db.commit()

-        # Delete file
-        os.remove(filepath)
+        return self.get_config_by_id(config_id)

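The removed `create_from_cidr()` relied on the standard `ipaddress` module. Its expansion step can be reproduced directly; note that on older Python versions `hosts()` yielded nothing for single-address networks, which is why the removed code fell back to the network address:

```python
import ipaddress

network = ipaddress.ip_network("10.0.0.0/30", strict=False)
print(network.num_addresses)                 # 4 (includes network and broadcast)
print([str(ip) for ip in network.hosts()])   # ['10.0.0.1', '10.0.0.2']
```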
-    def validate_config_content(self, content: Dict) -> Tuple[bool, str]:
+    def delete_config(self, config_id: int) -> None:
         """
-        Validate parsed YAML config structure.
+        Delete a scan configuration from database.

+        This will cascade delete associated ScanConfigSite records.
+        Schedules and scans referencing this config will have their
+        config_id set to NULL.
+
         Args:
-            content: Parsed YAML config as dict
+            config_id: Configuration ID to delete

-        Returns:
-            Tuple of (is_valid, error_message)
+        Raises:
+            ValueError: If config not found
         """
-        if not isinstance(content, dict):
-            return False, "Config must be a dictionary/object"
+        from web.models import ScanConfig

-        # Check required fields
-        if 'title' not in content:
-            return False, "Missing required field: 'title'"
+        config = self.db.query(ScanConfig).filter_by(id=config_id).first()

-        if 'sites' not in content:
-            return False, "Missing required field: 'sites'"
+        if not config:
+            raise ValueError(f"Config with ID {config_id} not found")

-        # Validate title
-        if not isinstance(content['title'], str) or not content['title'].strip():
-            return False, "Field 'title' must be a non-empty string"
+        self.db.delete(config)
+        self.db.commit()

-        # Validate sites
-        sites = content['sites']
-        if not isinstance(sites, list):
-            return False, "Field 'sites' must be a list"
-
-        if len(sites) == 0:
-            return False, "Must have at least one site defined"
-
-        # Validate each site
-        for i, site in enumerate(sites):
-            if not isinstance(site, dict):
-                return False, f"Site {i+1} must be a dictionary/object"
-
-            if 'name' not in site:
-                return False, f"Site {i+1} missing required field: 'name'"
-
-            if 'ips' not in site:
-                return False, f"Site {i+1} missing required field: 'ips'"
-
-            if not isinstance(site['ips'], list):
-                return False, f"Site {i+1} field 'ips' must be a list"
-
-            if len(site['ips']) == 0:
-                return False, f"Site {i+1} must have at least one IP"
-
-            # Validate each IP
-            for j, ip_config in enumerate(site['ips']):
-                if not isinstance(ip_config, dict):
-                    return False, f"Site {i+1} IP {j+1} must be a dictionary/object"
-
-                if 'address' not in ip_config:
-                    return False, f"Site {i+1} IP {j+1} missing required field: 'address'"
-
-                if 'expected' not in ip_config:
-                    return False, f"Site {i+1} IP {j+1} missing required field: 'expected'"
-
-                if not isinstance(ip_config['expected'], dict):
-                    return False, f"Site {i+1} IP {j+1} field 'expected' must be a dictionary/object"
-
-        return True, ""

-    def get_schedules_using_config(self, filename: str) -> List[str]:
+    def add_site_to_config(self, config_id: int, site_id: int) -> Dict[str, Any]:
         """
-        Get list of schedule names using this config.
+        Add a site to an existing config.

         Args:
-            filename: Config filename
+            config_id: Configuration ID
+            site_id: Site ID to add

         Returns:
-            List of schedule names (e.g., ["Daily Scan", "Weekly Audit"])
+            Updated config dictionary
+
+        Raises:
+            ValueError: If config or site not found, or association already exists
         """
-        # Import here to avoid circular dependency
-        try:
-            from web.services.schedule_service import ScheduleService
-            from flask import current_app
+        from web.models import ScanConfig, Site, ScanConfigSite

-            # Get database session from Flask app
-            db = current_app.db_session
+        config = self.db.query(ScanConfig).filter_by(id=config_id).first()
+        if not config:
+            raise ValueError(f"Config with ID {config_id} not found")

-            # Get all schedules (use large per_page to get all)
-            schedule_service = ScheduleService(db)
-            result = schedule_service.list_schedules(page=1, per_page=10000)
+        site = self.db.query(Site).filter_by(id=site_id).first()
+        if not site:
+            raise ValueError(f"Site with ID {site_id} not found")

-            # Extract schedules list from paginated result
-            schedules = result.get('schedules', [])
+        # Check if association already exists
+        existing = self.db.query(ScanConfigSite).filter_by(
+            config_id=config_id, site_id=site_id
+        ).first()

-            # Build full path for comparison
-            config_path = os.path.join(self.configs_dir, filename)
+        if existing:
+            raise ValueError(f"Site '{site.name}' is already in this config")

-            # Find schedules using this config (only enabled schedules)
-            using_schedules = []
-            for schedule in schedules:
-                schedule_config = schedule.get('config_file', '')
+        # Create association
+        assoc = ScanConfigSite(
+            config_id=config_id,
+            site_id=site_id,
+            created_at=datetime.utcnow()
+        )
+        self.db.add(assoc)

-                # Handle both absolute paths and just filenames
-                if schedule_config == filename or schedule_config == config_path:
-                    # Only count enabled schedules
-                    if schedule.get('enabled', False):
-                        using_schedules.append(schedule.get('name', 'Unknown'))
+        config.updated_at = datetime.utcnow()
+        self.db.commit()

-            return using_schedules
+        return self.get_config_by_id(config_id)

-        except ImportError:
-            # If ScheduleService doesn't exist yet, return empty list
-            return []
-        except Exception as e:
-            # If any error occurs, return empty list (safer than failing)
-            # Log the error for debugging
-            import logging
-            logging.getLogger(__name__).error(f"Error getting schedules using config {filename}: {e}", exc_info=True)
-            return []

-    def generate_filename_from_title(self, title: str) -> str:
+    def remove_site_from_config(self, config_id: int, site_id: int) -> Dict[str, Any]:
         """
-        Generate safe filename from scan title.
+        Remove a site from a config.

         Args:
-            title: Scan title string
+            config_id: Configuration ID
+            site_id: Site ID to remove

         Returns:
-            Safe filename (e.g., "Prod Scan 2025" -> "prod-scan-2025.yaml")
+            Updated config dictionary
+
+        Raises:
+            ValueError: If config not found, or removing would leave config empty
         """
-        # Convert to lowercase
-        filename = title.lower()
+        from web.models import ScanConfig, ScanConfigSite

-        # Replace spaces with hyphens
-        filename = filename.replace(' ', '-')
+        config = self.db.query(ScanConfig).filter_by(id=config_id).first()
+        if not config:
+            raise ValueError(f"Config with ID {config_id} not found")

-        # Remove special characters (keep only alphanumeric, hyphens, underscores)
-        filename = re.sub(r'[^a-z0-9\-_]', '', filename)
+        # Check if this would leave the config empty
+        current_site_count = len(config.site_associations)
+        if current_site_count <= 1:
+            raise ValueError("Cannot remove last site from config. Delete the config instead.")

-        # Remove consecutive hyphens
-        filename = re.sub(r'-+', '-', filename)
+        # Remove association
+        deleted = self.db.query(ScanConfigSite).filter_by(
+            config_id=config_id, site_id=site_id
+        ).delete()

-        # Remove leading/trailing hyphens
-        filename = filename.strip('-')
+        if deleted == 0:
+            raise ValueError(f"Site with ID {site_id} is not in this config")

-        # Limit length (max 200 chars, reserve 5 for .yaml)
-        max_length = 195
-        if len(filename) > max_length:
-            filename = filename[:max_length]
+        config.updated_at = datetime.utcnow()
+        self.db.commit()

-        # Ensure not empty
-        if not filename:
-            filename = 'config'
-
-        # Add .yaml extension
-        filename += '.yaml'
-
-        return filename
-
-    def get_config_path(self, filename: str) -> str:
-        """
-        Get absolute path for a config file.
-
-        Args:
-            filename: Config filename
-
-        Returns:
-            Absolute path to config file
-        """
-        return os.path.join(self.configs_dir, filename)
-
-    def config_exists(self, filename: str) -> bool:
-        """
-        Check if a config file exists.
-
-        Args:
-            filename: Config filename
-
-        Returns:
-            True if file exists, False otherwise
-        """
-        filepath = os.path.join(self.configs_dir, filename)
-        return os.path.exists(filepath) and os.path.isfile(filepath)
+        return self.get_config_by_id(config_id)

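The removed `generate_filename_from_title()` was a plain slug pipeline. A condensed equivalent for reference (hypothetical helper name, same steps):

```python
import re

def slugify_title(title: str) -> str:
    name = title.lower().replace(' ', '-')       # lowercase, spaces to hyphens
    name = re.sub(r'[^a-z0-9\-_]', '', name)     # drop special characters
    name = re.sub(r'-+', '-', name).strip('-')   # collapse and trim hyphens
    return (name[:195] or 'config') + '.yaml'    # cap length, never empty

print(slugify_title("Prod Scan 2025"))   # prod-scan-2025.yaml
print(slugify_title("***"))              # config.yaml
```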
@@ -16,10 +16,10 @@ from sqlalchemy.orm import Session, joinedload

 from web.models import (
     Scan, ScanSite, ScanIP, ScanPort, ScanService as ScanServiceModel,
-    ScanCertificate, ScanTLSVersion
+    ScanCertificate, ScanTLSVersion, Site, ScanSiteAssociation, SiteIP
 )
 from web.utils.pagination import paginate, PaginatedResult
-from web.utils.validators import validate_config_file, validate_scan_status
+from web.utils.validators import validate_scan_status

 logger = logging.getLogger(__name__)

@@ -41,8 +41,9 @@ class ScanService:
         """
         self.db = db_session

-    def trigger_scan(self, config_file: str, triggered_by: str = 'manual',
-                     schedule_id: Optional[int] = None, scheduler=None) -> int:
+    def trigger_scan(self, config_id: int,
+                     triggered_by: str = 'manual', schedule_id: Optional[int] = None,
+                     scheduler=None) -> int:
         """
         Trigger a new scan.

@@ -50,7 +51,7 @@ class ScanService:
        queues the scan for background execution.

         Args:
-            config_file: Path to YAML configuration file
+            config_id: Database config ID
             triggered_by: Source that triggered scan (manual, scheduled, api)
             schedule_id: Optional schedule ID if triggered by schedule
             scheduler: Optional SchedulerService instance for queuing background jobs
@@ -59,30 +60,21 @@ class ScanService:
             Scan ID of the created scan

         Raises:
-            ValueError: If config file is invalid
+            ValueError: If config is invalid
         """
-        # Validate config file
-        is_valid, error_msg = validate_config_file(config_file)
-        if not is_valid:
-            raise ValueError(f"Invalid config file: {error_msg}")
+        from web.models import ScanConfig

-        # Convert config_file to full path if it's just a filename
-        if not config_file.startswith('/'):
-            config_path = f'/app/configs/{config_file}'
-        else:
-            config_path = config_file
+        # Validate config exists
+        db_config = self.db.query(ScanConfig).filter_by(id=config_id).first()
+        if not db_config:
+            raise ValueError(f"Config with ID {config_id} not found")

-        # Load config to get title
-        import yaml
-        with open(config_path, 'r') as f:
-            config = yaml.safe_load(f)

-        # Create scan record
+        # Create scan record with config_id
         scan = Scan(
             timestamp=datetime.utcnow(),
             status='running',
-            config_file=config_file,
-            title=config.get('title', 'Untitled Scan'),
+            config_id=config_id,
+            title=db_config.title,
             triggered_by=triggered_by,
             schedule_id=schedule_id,
             created_at=datetime.utcnow()
@@ -92,12 +84,12 @@ class ScanService:
         self.db.commit()
         self.db.refresh(scan)

-        logger.info(f"Scan {scan.id} triggered via {triggered_by}")
+        logger.info(f"Scan {scan.id} triggered via {triggered_by} with config_id={config_id}")

         # Queue background job if scheduler provided
         if scheduler:
             try:
-                job_id = scheduler.queue_scan(scan.id, config_file)
+                job_id = scheduler.queue_scan(scan.id, config_id=config_id)
                 logger.info(f"Scan {scan.id} queued for background execution (job_id={job_id})")
             except Exception as e:
                 logger.error(f"Failed to queue scan {scan.id}: {str(e)}")
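A sketch of the new `config_id`-based call path, assuming a request handler with a session and the `scheduler` attribute the code reads from the Flask app; both are assumptions drawn from the surrounding hunks:

```python
scan_service = ScanService(db_session)
scan_id = scan_service.trigger_scan(
    config_id=1,              # must reference an existing ScanConfig row
    triggered_by='api',
    scheduler=app.scheduler,  # assumed app attribute, as used elsewhere
)
print(f"queued scan {scan_id}")
```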
@@ -265,58 +257,128 @@ class ScanService:
         elif scan.status == 'failed':
             status_info['progress'] = 'Failed'
             status_info['error_message'] = scan.error_message
         elif scan.status == 'cancelled':
             status_info['progress'] = 'Cancelled'
             status_info['error_message'] = scan.error_message

         return status_info

-    def cleanup_orphaned_scans(self) -> int:
+    def get_scans_by_ip(self, ip_address: str, limit: int = 10) -> List[Dict[str, Any]]:
         """
-        Clean up orphaned scans that are stuck in 'running' status.
+        Get the last N scans containing a specific IP address.

+        Args:
+            ip_address: IP address to search for
+            limit: Maximum number of scans to return (default: 10)
+
+        Returns:
+            List of scan summary dictionaries, most recent first
+        """
+        scans = (
+            self.db.query(Scan)
+            .join(ScanIP, Scan.id == ScanIP.scan_id)
+            .filter(ScanIP.ip_address == ip_address)
+            .filter(Scan.status == 'completed')
+            .order_by(Scan.timestamp.desc())
+            .limit(limit)
+            .all()
+        )
+
+        return [self._scan_to_summary_dict(scan) for scan in scans]
+
+    def cleanup_orphaned_scans(self) -> dict:
+        """
+        Clean up orphaned scans with smart recovery.
+
+        For scans stuck in 'running' or 'finalizing' status:
+        - If output files exist: mark as 'completed' (smart recovery)
+        - If no output files: mark as 'failed'

         This should be called on application startup to handle scans that
         were running when the system crashed or was restarted.

-        Scans in 'running' status are marked as 'failed' with an appropriate
-        error message indicating they were orphaned.
-
         Returns:
-            Number of orphaned scans cleaned up
+            Dictionary with cleanup results: {'recovered': N, 'failed': N, 'total': N}
         """
-        # Find all scans with status='running'
-        orphaned_scans = self.db.query(Scan).filter(Scan.status == 'running').all()
+        # Find all scans with status='running' or 'finalizing'
+        orphaned_scans = self.db.query(Scan).filter(
+            Scan.status.in_(['running', 'finalizing'])
+        ).all()

         if not orphaned_scans:
             logger.info("No orphaned scans found")
-            return 0
+            return {'recovered': 0, 'failed': 0, 'total': 0}

         count = len(orphaned_scans)
-        logger.warning(f"Found {count} orphaned scan(s) in 'running' status, marking as failed")
+        logger.warning(f"Found {count} orphaned scan(s), attempting smart recovery")

+        recovered_count = 0
+        failed_count = 0
+        output_dir = Path('/app/output')

-        # Mark each orphaned scan as failed
         for scan in orphaned_scans:
-            scan.status = 'failed'
+            # Check for existing output files
+            output_exists = False
+            output_files_found = []
+
+            # Check paths stored in database
+            if scan.json_path and Path(scan.json_path).exists():
+                output_exists = True
+                output_files_found.append('json')
+            if scan.html_path and Path(scan.html_path).exists():
+                output_files_found.append('html')
+            if scan.zip_path and Path(scan.zip_path).exists():
+                output_files_found.append('zip')
+
+            # Also check by timestamp pattern if paths not stored yet
+            if not output_exists and scan.started_at and output_dir.exists():
+                timestamp_pattern = scan.started_at.strftime('%Y%m%d')
+                for json_file in output_dir.glob(f'scan_report_{timestamp_pattern}*.json'):
+                    output_exists = True
+                    output_files_found.append('json')
+                    # Update scan record with found paths
+                    scan.json_path = str(json_file)
+                    html_file = json_file.with_suffix('.html')
+                    if html_file.exists():
+                        scan.html_path = str(html_file)
+                        output_files_found.append('html')
+                    zip_file = json_file.with_suffix('.zip')
+                    if zip_file.exists():
+                        scan.zip_path = str(zip_file)
+                        output_files_found.append('zip')
+                    break
+
+            if output_exists:
+                # Smart recovery: outputs exist, mark as completed
+                scan.status = 'completed'
+                scan.error_message = f'Recovered from orphaned state (output files found: {", ".join(output_files_found)})'
+                recovered_count += 1
+                logger.info(f"Recovered orphaned scan {scan.id} as completed (files: {output_files_found})")
+            else:
+                # No outputs: mark as failed
+                scan.status = 'failed'
+                scan.error_message = (
+                    "Scan was interrupted by system shutdown or crash. "
+                    "No output files were generated."
+                )
+                failed_count += 1
+                logger.info(f"Marked orphaned scan {scan.id} as failed (no output files)")

             scan.completed_at = datetime.utcnow()
-            scan.error_message = (
-                "Scan was interrupted by system shutdown or crash. "
-                "The scan was running but did not complete normally."
-            )

             # Calculate duration if we have a started_at time
             if scan.started_at:
-                duration = (datetime.utcnow() - scan.started_at).total_seconds()
-                scan.duration = duration
-
-            logger.info(
-                f"Marked orphaned scan {scan.id} as failed "
-                f"(started: {scan.started_at.isoformat() if scan.started_at else 'unknown'})"
-            )
+                scan.duration = (datetime.utcnow() - scan.started_at).total_seconds()

         self.db.commit()
-        logger.info(f"Cleaned up {count} orphaned scan(s)")
+        logger.info(f"Cleaned up {count} orphaned scan(s): {recovered_count} recovered, {failed_count} failed")

-        return count
+        return {
+            'recovered': recovered_count,
+            'failed': failed_count,
+            'total': count
+        }

     def _save_scan_to_db(self, report: Dict[str, Any], scan_id: int,
-                         status: str = 'completed') -> None:
+                         status: str = 'completed', output_paths: Dict = None) -> None:
         """
         Save scan results to database.

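Startup wiring for the recovery pass might look like this sketch (session and logger assumed):

```python
results = ScanService(db_session).cleanup_orphaned_scans()
logger.info(
    "orphan cleanup: %d recovered, %d failed, %d total",
    results['recovered'], results['failed'], results['total'],
)
```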
@@ -327,6 +389,7 @@ class ScanService:
             report: Scan report dictionary from scanner
             scan_id: Scan ID to update
             status: Final scan status (completed or failed)
+            output_paths: Dictionary with paths to generated files {'json': Path, 'html': Path, 'zip': Path}
         """
         scan = self.db.query(Scan).filter(Scan.id == scan_id).first()
         if not scan:
@@ -337,6 +400,17 @@ class ScanService:
         scan.duration = report.get('scan_duration')
         scan.completed_at = datetime.utcnow()

+        # Save output file paths
+        if output_paths:
+            if 'json' in output_paths:
+                scan.json_path = str(output_paths['json'])
+            if 'html' in output_paths:
+                scan.html_path = str(output_paths['html'])
+            if 'zip' in output_paths:
+                scan.zip_path = str(output_paths['zip'])
+            if 'screenshots' in output_paths:
+                scan.screenshot_dir = str(output_paths['screenshots'])
+
         # Map report data to database models
         self._map_report_to_models(report, scan)

@@ -366,6 +440,34 @@ class ScanService:
             self.db.add(site)
             self.db.flush()  # Get site.id for foreign key

+            # Create ScanSiteAssociation if this site exists in the database
+            # This links the scan to reusable site definitions
+            master_site = (
+                self.db.query(Site)
+                .filter(Site.name == site_data['name'])
+                .first()
+            )
+
+            if master_site:
+                # Check if association already exists (avoid duplicates)
+                existing_assoc = (
+                    self.db.query(ScanSiteAssociation)
+                    .filter(
+                        ScanSiteAssociation.scan_id == scan_obj.id,
+                        ScanSiteAssociation.site_id == master_site.id
+                    )
+                    .first()
+                )
+
+                if not existing_assoc:
+                    assoc = ScanSiteAssociation(
+                        scan_id=scan_obj.id,
+                        site_id=master_site.id,
+                        created_at=datetime.utcnow()
+                    )
+                    self.db.add(assoc)
+                    logger.debug(f"Created association between scan {scan_obj.id} and site '{master_site.name}' (id={master_site.id})")
+
             # Process each IP in this site
             for ip_data in site_data.get('ips', []):
                 # Create ScanIP record
@@ -419,9 +521,10 @@ class ScanService:

             # Process certificate and TLS info if present
             http_info = service_data.get('http_info', {})
-            if http_info.get('certificate'):
+            ssl_tls = http_info.get('ssl_tls', {})
+            if ssl_tls.get('certificate'):
                 self._process_certificate(
-                    http_info['certificate'],
+                    ssl_tls,
                     scan_obj.id,
                     service.id
                 )
@@ -459,16 +562,19 @@ class ScanService:
             return service
         return None

-    def _process_certificate(self, cert_data: Dict[str, Any], scan_id: int,
+    def _process_certificate(self, ssl_tls_data: Dict[str, Any], scan_id: int,
                              service_id: int) -> None:
         """
         Process certificate and TLS version data.

         Args:
-            cert_data: Certificate data dictionary
+            ssl_tls_data: SSL/TLS data dictionary containing 'certificate' and 'tls_versions'
             scan_id: Scan ID
             service_id: Service ID
         """
+        # Extract certificate data from ssl_tls structure
+        cert_data = ssl_tls_data.get('certificate', {})
+
         # Create ScanCertificate record
         cert = ScanCertificate(
             scan_id=scan_id,
@@ -486,7 +592,7 @@ class ScanService:
         self.db.flush()

         # Process TLS versions
-        tls_versions = cert_data.get('tls_versions', {})
+        tls_versions = ssl_tls_data.get('tls_versions', {})
         for version, version_data in tls_versions.items():
             tls = ScanTLSVersion(
                 scan_id=scan_id,
@@ -535,7 +641,7 @@ class ScanService:
             'duration': scan.duration,
             'status': scan.status,
             'title': scan.title,
-            'config_file': scan.config_file,
+            'config_id': scan.config_id,
             'json_path': scan.json_path,
             'html_path': scan.html_path,
             'zip_path': scan.zip_path,
@@ -561,24 +667,54 @@ class ScanService:
             'duration': scan.duration,
             'status': scan.status,
             'title': scan.title,
-            'config_file': scan.config_file,
+            'config_id': scan.config_id,
             'triggered_by': scan.triggered_by,
             'created_at': scan.created_at.isoformat() if scan.created_at else None
         }

     def _site_to_dict(self, site: ScanSite) -> Dict[str, Any]:
         """Convert ScanSite to dictionary."""
+        # Look up the master Site ID from ScanSiteAssociation
+        master_site_id = None
+        assoc = (
+            self.db.query(ScanSiteAssociation)
+            .filter(
+                ScanSiteAssociation.scan_id == site.scan_id,
+            )
+            .join(Site)
+            .filter(Site.name == site.site_name)
+            .first()
+        )
+        if assoc:
+            master_site_id = assoc.site_id
+
         return {
             'id': site.id,
             'name': site.site_name,
-            'ips': [self._ip_to_dict(ip) for ip in site.ips]
+            'site_id': master_site_id,  # The actual Site ID for config updates
+            'ips': [self._ip_to_dict(ip, master_site_id) for ip in site.ips]
         }

-    def _ip_to_dict(self, ip: ScanIP) -> Dict[str, Any]:
+    def _ip_to_dict(self, ip: ScanIP, site_id: Optional[int] = None) -> Dict[str, Any]:
         """Convert ScanIP to dictionary."""
+        # Look up the SiteIP ID for this IP address in the master Site
+        site_ip_id = None
+        if site_id:
+            site_ip = (
+                self.db.query(SiteIP)
+                .filter(
+                    SiteIP.site_id == site_id,
+                    SiteIP.ip_address == ip.ip_address
+                )
+                .first()
+            )
+            if site_ip:
+                site_ip_id = site_ip.id
+
         return {
             'id': ip.id,
             'address': ip.ip_address,
+            'site_ip_id': site_ip_id,  # The actual SiteIP ID for config updates
             'ping_expected': ip.ping_expected,
             'ping_actual': ip.ping_actual,
             'ports': [self._port_to_dict(port) for port in ip.ports]
@@ -704,17 +840,17 @@ class ScanService:
         return None

         # Check if scans use the same configuration
-        config1 = scan1.get('config_file', '')
-        config2 = scan2.get('config_file', '')
-        same_config = (config1 == config2) and (config1 != '')
+        config1 = scan1.get('config_id')
+        config2 = scan2.get('config_id')
+        same_config = (config1 == config2) and (config1 is not None)

         # Generate warning message if configs differ
         config_warning = None
         if not same_config:
             config_warning = (
                 f"These scans use different configurations. "
-                f"Scan #{scan1_id} used '{config1 or 'unknown'}' and "
-                f"Scan #{scan2_id} used '{config2 or 'unknown'}'. "
+                f"Scan #{scan1_id} used config_id={config1 or 'unknown'} and "
+                f"Scan #{scan2_id} used config_id={config2 or 'unknown'}. "
                 f"The comparison may show all changes as additions/removals if the scans "
                 f"cover different IP ranges or infrastructure."
             )
@@ -753,14 +889,14 @@ class ScanService:
             'timestamp': scan1['timestamp'],
             'title': scan1['title'],
             'status': scan1['status'],
-            'config_file': config1
+            'config_id': config1
         },
         'scan2': {
             'id': scan2['id'],
             'timestamp': scan2['timestamp'],
             'title': scan2['title'],
             'status': scan2['status'],
-            'config_file': config2
+            'config_id': config2
         },
         'same_config': same_config,
         'config_warning': config_warning,

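Downstream consumers now key off `config_id` in the comparison payload. A sketch; the comparison method's name is assumed, since only its body appears in this hunk:

```python
comparison = scan_service.compare_scans(scan1_id=10, scan2_id=12)  # name assumed
if not comparison['same_config']:
    print(comparison['config_warning'])
```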
@@ -6,14 +6,13 @@ scheduled scans with cron expressions.
 """

 import logging
-import os
-from datetime import datetime
+from datetime import datetime, timezone
 from typing import Any, Dict, List, Optional, Tuple

 from croniter import croniter
 from sqlalchemy.orm import Session

-from web.models import Schedule, Scan
+from web.models import Schedule, Scan, ScanConfig
 from web.utils.pagination import paginate, PaginatedResult

 logger = logging.getLogger(__name__)
@@ -39,7 +38,7 @@ class ScheduleService:
     def create_schedule(
         self,
         name: str,
-        config_file: str,
+        config_id: int,
         cron_expression: str,
         enabled: bool = True
     ) -> int:
@@ -48,7 +47,7 @@ class ScheduleService:

         Args:
             name: Human-readable schedule name
-            config_file: Path to YAML configuration file
+            config_id: Database config ID
             cron_expression: Cron expression (e.g., '0 2 * * *')
             enabled: Whether schedule is active

@@ -56,36 +55,32 @@ class ScheduleService:
             Schedule ID of the created schedule

         Raises:
-            ValueError: If cron expression is invalid or config file doesn't exist
+            ValueError: If cron expression is invalid or config doesn't exist
         """
         # Validate cron expression
         is_valid, error_msg = self.validate_cron_expression(cron_expression)
         if not is_valid:
             raise ValueError(f"Invalid cron expression: {error_msg}")

-        # Validate config file exists
-        # If config_file is just a filename, prepend the configs directory
-        if not config_file.startswith('/'):
-            config_file_path = os.path.join('/app/configs', config_file)
-        else:
-            config_file_path = config_file
-
-        if not os.path.isfile(config_file_path):
-            raise ValueError(f"Config file not found: {config_file}")
+        # Validate config exists
+        db_config = self.db.query(ScanConfig).filter_by(id=config_id).first()
+        if not db_config:
+            raise ValueError(f"Config with ID {config_id} not found")

         # Calculate next run time
         next_run = self.calculate_next_run(cron_expression) if enabled else None

         # Create schedule record
+        now_utc = datetime.now(timezone.utc)
         schedule = Schedule(
             name=name,
-            config_file=config_file,
+            config_id=config_id,
             cron_expression=cron_expression,
             enabled=enabled,
             last_run=None,
             next_run=next_run,
-            created_at=datetime.utcnow(),
-            updated_at=datetime.utcnow()
+            created_at=now_utc,
+            updated_at=now_utc
         )

         self.db.add(schedule)
@@ -109,7 +104,14 @@ class ScheduleService:
         Raises:
             ValueError: If schedule not found
         """
-        schedule = self.db.query(Schedule).filter(Schedule.id == schedule_id).first()
+        from sqlalchemy.orm import joinedload
+
+        schedule = (
+            self.db.query(Schedule)
+            .options(joinedload(Schedule.config))
+            .filter(Schedule.id == schedule_id)
+            .first()
+        )

         if not schedule:
             raise ValueError(f"Schedule {schedule_id} not found")
@@ -144,8 +146,10 @@ class ScheduleService:
                 'pages': int
             }
         """
-        # Build query
-        query = self.db.query(Schedule)
+        from sqlalchemy.orm import joinedload
+
+        # Build query and eagerly load config relationship
+        query = self.db.query(Schedule).options(joinedload(Schedule.config))

         # Apply filter
         if enabled_filter is not None:
@@ -200,17 +204,11 @@ class ScheduleService:
             if schedule.enabled or updates.get('enabled', False):
                 updates['next_run'] = self.calculate_next_run(updates['cron_expression'])

-        # Validate config file if being updated
-        if 'config_file' in updates:
-            config_file = updates['config_file']
-            # If config_file is just a filename, prepend the configs directory
-            if not config_file.startswith('/'):
-                config_file_path = os.path.join('/app/configs', config_file)
-            else:
-                config_file_path = config_file
-
-            if not os.path.isfile(config_file_path):
-                raise ValueError(f"Config file not found: {updates['config_file']}")
+        # Validate config_id if being updated
+        if 'config_id' in updates:
+            db_config = self.db.query(ScanConfig).filter_by(id=updates['config_id']).first()
+            if not db_config:
+                raise ValueError(f"Config with ID {updates['config_id']} not found")

         # Handle enabled toggle
         if 'enabled' in updates:
@@ -227,7 +225,7 @@ class ScheduleService:
             if hasattr(schedule, key):
                 setattr(schedule, key, value)

-        schedule.updated_at = datetime.utcnow()
+        schedule.updated_at = datetime.now(timezone.utc)

         self.db.commit()
         self.db.refresh(schedule)
@@ -310,7 +308,7 @@ class ScheduleService:

         schedule.last_run = last_run
         schedule.next_run = next_run
-        schedule.updated_at = datetime.utcnow()
+        schedule.updated_at = datetime.now(timezone.utc)

         self.db.commit()

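Creating a schedule against a stored config, per the new signature (assumes a `ScanConfig` with ID 1 exists):

```python
schedule_service = ScheduleService(db_session)
schedule_id = schedule_service.create_schedule(
    name="Nightly prod scan",
    config_id=1,
    cron_expression="0 2 * * *",   # 02:00 every day
    enabled=True,
)
```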
@@ -323,23 +321,43 @@ class ScheduleService:
         Validate a cron expression.

         Args:
-            cron_expr: Cron expression to validate
+            cron_expr: Cron expression to validate in standard crontab format
+                Format: minute hour day month day_of_week
+                Day of week: 0=Sunday, 1=Monday, ..., 6=Saturday
+                (APScheduler will convert this to its internal format automatically)

         Returns:
             Tuple of (is_valid, error_message)
+            - (True, None) if valid
+            - (False, error_message) if invalid
+
+        Note:
+            This validates using croniter which uses standard crontab format.
+            APScheduler's from_crontab() will handle the conversion when the
+            schedule is registered with the scheduler.
         """
         try:
-            # Try to create a croniter instance
-            base_time = datetime.utcnow()
+            # croniter uses standard crontab format (Sunday=0)
+            from datetime import timezone
+            base_time = datetime.now(timezone.utc)
             cron = croniter(cron_expr, base_time)

             # Try to get the next run time (validates the expression)
             cron.get_next(datetime)

+            # Validate basic format (5 fields)
+            fields = cron_expr.split()
+            if len(fields) != 5:
+                return (False, f"Cron expression must have 5 fields (minute hour day month day_of_week), got {len(fields)}")
+
             return (True, None)
         except (ValueError, KeyError) as e:
+            error_msg = str(e)
+            # Add helpful hint for day_of_week errors
+            if "day" in error_msg.lower() and len(cron_expr.split()) >= 5:
+                hint = "\nNote: Use standard crontab format where 0=Sunday, 1=Monday, ..., 6=Saturday"
+                return (False, f"{error_msg}{hint}")
             return (False, str(e))
         except Exception as e:
             return (False, f"Unexpected error: {str(e)}")
@@ -357,17 +375,24 @@ class ScheduleService:
             from_time: Base time (defaults to now UTC)

         Returns:
-            Next run datetime (UTC)
+            Next run datetime (UTC, timezone-aware)

         Raises:
             ValueError: If cron expression is invalid
         """
         if from_time is None:
-            from_time = datetime.utcnow()
+            from_time = datetime.now(timezone.utc)

         try:
             cron = croniter(cron_expr, from_time)
-            return cron.get_next(datetime)
+            next_run = cron.get_next(datetime)
+
+            # croniter returns naive datetime, so we need to add timezone info
+            # Since we're using UTC for all calculations, add UTC timezone
+            if next_run.tzinfo is None:
+                next_run = next_run.replace(tzinfo=timezone.utc)
+
+            return next_run
         except Exception as e:
             raise ValueError(f"Invalid cron expression '{cron_expr}': {str(e)}")

@@ -400,7 +425,7 @@ class ScheduleService:
                 'timestamp': scan.timestamp.isoformat() if scan.timestamp else None,
                 'status': scan.status,
                 'title': scan.title,
-                'config_file': scan.config_file
+                'config_id': scan.config_id
             }
             for scan in scans
         ]
@@ -415,10 +440,16 @@ class ScheduleService:
         Returns:
             Dictionary representation
         """
+        # Get config title if relationship is loaded
+        config_name = None
+        if schedule.config:
+            config_name = schedule.config.title
+
         return {
             'id': schedule.id,
             'name': schedule.name,
-            'config_file': schedule.config_file,
+            'config_id': schedule.config_id,
+            'config_name': config_name,
             'cron_expression': schedule.cron_expression,
             'enabled': schedule.enabled,
             'last_run': schedule.last_run.isoformat() if schedule.last_run else None,
@@ -433,7 +464,7 @@ class ScheduleService:
         Format datetime as relative time.

         Args:
-            dt: Datetime to format (UTC)
+            dt: Datetime to format (UTC, can be naive or aware)

         Returns:
             Human-readable relative time (e.g., "in 2 hours", "yesterday")
@@ -441,7 +472,13 @@ class ScheduleService:
         if dt is None:
             return None

-        now = datetime.utcnow()
+        # Ensure both datetimes are timezone-aware for comparison
+        now = datetime.now(timezone.utc)
+
+        # If dt is naive, assume it's UTC and add timezone info
+        if dt.tzinfo is None:
+            dt = dt.replace(tzinfo=timezone.utc)

         diff = dt - now

         # Future times

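The naive-datetime normalization above matters because `croniter` can return naive values. The same pattern in isolation:

```python
from datetime import datetime, timezone
from croniter import croniter

base = datetime.now(timezone.utc)
next_run = croniter("0 2 * * *", base).get_next(datetime)
if next_run.tzinfo is None:   # croniter may hand back a naive datetime
    next_run = next_run.replace(tzinfo=timezone.utc)
print(next_run.isoformat())
```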
@@ -131,7 +131,7 @@ class SchedulerService:
            try:
                self.add_scheduled_scan(
                    schedule_id=schedule.id,
-                   config_file=schedule.config_file,
+                   config_id=schedule.config_id,
                    cron_expression=schedule.cron_expression
                )
                logger.info(f"Loaded schedule {schedule.id}: '{schedule.name}'")
@@ -149,13 +149,58 @@ class SchedulerService:
        except Exception as e:
            logger.error(f"Error loading schedules on startup: {str(e)}", exc_info=True)

-   def queue_scan(self, scan_id: int, config_file: str) -> str:
+   @staticmethod
+   def validate_cron_expression(cron_expression: str) -> tuple[bool, str]:
        """
        Validate a cron expression and provide helpful feedback.

        Args:
            cron_expression: Cron expression to validate

        Returns:
            Tuple of (is_valid: bool, message: str)
            - If valid: (True, "Valid cron expression")
            - If invalid: (False, "Error message with details")

        Note:
            Standard crontab format: minute hour day month day_of_week
            Day of week: 0=Sunday, 1=Monday, ..., 6=Saturday (or 7=Sunday)
        """
        from apscheduler.triggers.cron import CronTrigger

        try:
            # Try to parse the expression
            trigger = CronTrigger.from_crontab(cron_expression)

            # Validate basic format (5 fields)
            fields = cron_expression.split()
            if len(fields) != 5:
                return False, f"Cron expression must have 5 fields (minute hour day month day_of_week), got {len(fields)}"

            return True, "Valid cron expression"

        except (ValueError, KeyError) as e:
            error_msg = str(e)

            # Provide helpful hints for common errors
            if "day_of_week" in error_msg.lower() or (len(cron_expression.split()) >= 5):
                # Check if day_of_week field might be using APScheduler format by mistake
                fields = cron_expression.split()
                if len(fields) == 5:
                    dow_field = fields[4]
                    if dow_field.isdigit() and int(dow_field) >= 0:
                        hint = "\nNote: Use standard crontab format where 0=Sunday, 1=Monday, ..., 6=Saturday"
                        return False, f"Invalid cron expression: {error_msg}{hint}"

            return False, f"Invalid cron expression: {error_msg}"
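As a quick illustration of the convention this validator enforces, a small sketch (assuming APScheduler is installed) of `CronTrigger.from_crontab` with standard crontab day-of-week numbering:

```python
from apscheduler.triggers.cron import CronTrigger

# "0 2 * * 0" = 02:00 every Sunday in standard crontab numbering;
# from_crontab() converts to APScheduler's internal Monday=0 form.
print(CronTrigger.from_crontab("0 2 * * 0"))

try:
    CronTrigger.from_crontab("0 2 * *")  # only 4 fields
except ValueError as e:
    print(f"rejected: {e}")  # validate_cron_expression turns this into (False, msg)
```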
+   def queue_scan(self, scan_id: int, config_id: int) -> str:
        """
        Queue a scan for immediate background execution.

        Args:
            scan_id: Database ID of the scan
-           config_file: Path to YAML configuration file
+           config_id: Database config ID

        Returns:
            Job ID from APScheduler
@@ -169,7 +214,7 @@ class SchedulerService:
        # Add job to run immediately
        job = self.scheduler.add_job(
            func=execute_scan,
-           args=[scan_id, config_file, self.db_url],
+           kwargs={'scan_id': scan_id, 'config_id': config_id, 'db_url': self.db_url},
            id=f'scan_{scan_id}',
            name=f'Scan {scan_id}',
            replace_existing=True,
@@ -179,15 +224,19 @@ class SchedulerService:
        logger.info(f"Queued scan {scan_id} for background execution (job_id={job.id})")
        return job.id

-   def add_scheduled_scan(self, schedule_id: int, config_file: str,
+   def add_scheduled_scan(self, schedule_id: int, config_id: int,
                           cron_expression: str) -> str:
        """
        Add a recurring scheduled scan.

        Args:
            schedule_id: Database ID of the schedule
-           config_file: Path to YAML configuration file
+           config_id: Database config ID
            cron_expression: Cron expression (e.g., "0 2 * * *" for 2am daily)
+               IMPORTANT: Use standard crontab format where:
+               - Day of week: 0 = Sunday, 1 = Monday, ..., 6 = Saturday
+               - APScheduler automatically converts to its internal format
+               - from_crontab() handles the conversion properly

        Returns:
            Job ID from APScheduler
@@ -195,18 +244,29 @@ class SchedulerService:
        Raises:
            RuntimeError: If scheduler not initialized
+           ValueError: If cron expression is invalid
+
+       Note:
+           APScheduler internally uses Monday=0, but from_crontab() accepts
+           standard crontab format (Sunday=0) and converts it automatically.
        """
        if not self.scheduler:
            raise RuntimeError("Scheduler not initialized. Call init_scheduler() first.")

        from apscheduler.triggers.cron import CronTrigger

+       # Validate cron expression first to provide helpful error messages
+       is_valid, message = self.validate_cron_expression(cron_expression)
+       if not is_valid:
+           raise ValueError(message)
+
        # Create cron trigger from expression using local timezone
        # This allows users to specify times in their local timezone
+       # from_crontab() parses standard crontab format (Sunday=0)
+       # and converts to APScheduler's internal format (Monday=0) automatically
        try:
            trigger = CronTrigger.from_crontab(cron_expression)
            # timezone defaults to local system timezone
        except (ValueError, KeyError) as e:
+           # This should not happen due to validation above, but catch anyway
            raise ValueError(f"Invalid cron expression '{cron_expression}': {str(e)}")

        # Add cron job
@@ -283,22 +343,27 @@ class SchedulerService:
        # Create and trigger scan
        scan_service = ScanService(session)
        scan_id = scan_service.trigger_scan(
-           config_file=schedule['config_file'],
+           config_id=schedule['config_id'],
            triggered_by='scheduled',
            schedule_id=schedule_id,
            scheduler=None  # Don't pass scheduler to avoid recursion
        )

        # Queue the scan for execution
-       self.queue_scan(scan_id, schedule['config_file'])
+       self.queue_scan(scan_id, schedule['config_id'])

        # Update schedule's last_run and next_run
        from croniter import croniter
-       next_run = croniter(schedule['cron_expression'], datetime.utcnow()).get_next(datetime)
+       now_utc = datetime.now(timezone.utc)
+       next_run = croniter(schedule['cron_expression'], now_utc).get_next(datetime)
+
+       # croniter returns naive datetime, add UTC timezone
+       if next_run.tzinfo is None:
+           next_run = next_run.replace(tzinfo=timezone.utc)

        schedule_service.update_run_times(
            schedule_id=schedule_id,
-           last_run=datetime.utcnow(),
+           last_run=now_utc,
            next_run=next_run
        )
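A short sketch (assuming the croniter package) of the next-run computation the job performs, including the naive-to-UTC normalization:

```python
from datetime import datetime, timezone
from croniter import croniter

now_utc = datetime.now(timezone.utc)
next_run = croniter("0 2 * * *", now_utc).get_next(datetime)

# Depending on the croniter version the result may come back naive,
# so the job pins it to UTC before storing it.
if next_run.tzinfo is None:
    next_run = next_run.replace(tzinfo=timezone.utc)

print(next_run.isoformat())
```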
683 app/web/services/site_service.py Normal file
@@ -0,0 +1,683 @@
"""
Site service for managing reusable site definitions.

This service handles the business logic for creating, updating, and managing
sites with their associated CIDR ranges and IP-level overrides.
"""

import ipaddress
import json
import logging
from datetime import datetime
from typing import Any, Dict, List, Optional

from sqlalchemy import func
from sqlalchemy.orm import Session, joinedload

from web.models import (
    Site, SiteIP, ScanSiteAssociation
)
from web.utils.pagination import paginate, PaginatedResult

logger = logging.getLogger(__name__)


class SiteService:
    """
    Service for managing reusable site definitions.

    Handles site lifecycle: creation, updates, deletion (with safety checks),
    CIDR management, and IP-level overrides.
    """

    def __init__(self, db_session: Session):
        """
        Initialize site service.

        Args:
            db_session: SQLAlchemy database session
        """
        self.db = db_session

    def create_site(self, name: str, description: Optional[str] = None) -> Dict[str, Any]:
        """
        Create a new site.

        Args:
            name: Unique site name
            description: Optional site description

        Returns:
            Dictionary with created site data

        Raises:
            ValueError: If site name already exists
        """
        # Validate site name is unique
        existing = self.db.query(Site).filter(Site.name == name).first()
        if existing:
            raise ValueError(f"Site with name '{name}' already exists")

        # Create site (can be empty, IPs added separately)
        site = Site(
            name=name,
            description=description,
            created_at=datetime.utcnow(),
            updated_at=datetime.utcnow()
        )

        self.db.add(site)
        self.db.commit()
        self.db.refresh(site)

        logger.info(f"Created site '{name}' (id={site.id})")

        return self._site_to_dict(site)

    def update_site(self, site_id: int, name: Optional[str] = None,
                    description: Optional[str] = None) -> Dict[str, Any]:
        """
        Update site metadata (name and/or description).

        Args:
            site_id: Site ID to update
            name: New site name (must be unique)
            description: New description

        Returns:
            Dictionary with updated site data

        Raises:
            ValueError: If site not found or name already exists
        """
        site = self.db.query(Site).filter(Site.id == site_id).first()
        if not site:
            raise ValueError(f"Site with id {site_id} not found")

        # Update name if provided
        if name is not None and name != site.name:
            # Check uniqueness
            existing = self.db.query(Site).filter(
                Site.name == name,
                Site.id != site_id
            ).first()
            if existing:
                raise ValueError(f"Site with name '{name}' already exists")
            site.name = name

        # Update description if provided
        if description is not None:
            site.description = description

        site.updated_at = datetime.utcnow()

        self.db.commit()
        self.db.refresh(site)

        logger.info(f"Updated site {site_id} ('{site.name}')")

        return self._site_to_dict(site)

    def delete_site(self, site_id: int) -> None:
        """
        Delete a site.

        Prevents deletion if the site is used in any scan (per user requirement).

        Args:
            site_id: Site ID to delete

        Raises:
            ValueError: If site not found or is used in scans
        """
        site = self.db.query(Site).filter(Site.id == site_id).first()
        if not site:
            raise ValueError(f"Site with id {site_id} not found")

        # Check if site is used in any scans
        usage_count = (
            self.db.query(func.count(ScanSiteAssociation.id))
            .filter(ScanSiteAssociation.site_id == site_id)
            .scalar()
        )

        if usage_count > 0:
            raise ValueError(
                f"Cannot delete site '{site.name}': it is used in {usage_count} scan(s). "
                f"Sites that have been used in scans cannot be deleted."
            )

        # Safe to delete
        self.db.delete(site)
        self.db.commit()

        logger.info(f"Deleted site {site_id} ('{site.name}')")

    def get_site(self, site_id: int) -> Optional[Dict[str, Any]]:
        """
        Get site details.

        Args:
            site_id: Site ID to retrieve

        Returns:
            Dictionary with site data, or None if not found
        """
        site = (
            self.db.query(Site)
            .filter(Site.id == site_id)
            .first()
        )

        if not site:
            return None

        return self._site_to_dict(site)

    def get_site_by_name(self, name: str) -> Optional[Dict[str, Any]]:
        """
        Get site details by name.

        Args:
            name: Site name to retrieve

        Returns:
            Dictionary with site data, or None if not found
        """
        site = (
            self.db.query(Site)
            .filter(Site.name == name)
            .first()
        )

        if not site:
            return None

        return self._site_to_dict(site)

    def list_sites(self, page: int = 1, per_page: int = 20) -> PaginatedResult:
        """
        List all sites with pagination.

        Args:
            page: Page number (1-indexed)
            per_page: Number of items per page

        Returns:
            PaginatedResult with site data
        """
        query = (
            self.db.query(Site)
            .order_by(Site.name)
        )

        return paginate(query, page, per_page, self._site_to_dict)

    def list_all_sites(self) -> List[Dict[str, Any]]:
        """
        List all sites without pagination (for dropdowns, etc.).

        Returns:
            List of site dictionaries
        """
        sites = (
            self.db.query(Site)
            .order_by(Site.name)
            .all()
        )

        return [self._site_to_dict(site) for site in sites]

    def get_global_ip_stats(self) -> Dict[str, int]:
        """
        Get global IP statistics across all sites.

        Returns:
            Dictionary with:
            - total_ips: Total count of IP entries (including duplicates)
            - unique_ips: Count of distinct IP addresses
            - duplicate_ips: Number of duplicate entries (total - unique)
        """
        # Total IP entries
        total_ips = (
            self.db.query(func.count(SiteIP.id))
            .scalar() or 0
        )

        # Unique IP addresses
        unique_ips = (
            self.db.query(func.count(func.distinct(SiteIP.ip_address)))
            .scalar() or 0
        )

        return {
            'total_ips': total_ips,
            'unique_ips': unique_ips,
            'duplicate_ips': total_ips - unique_ips
        }

    def bulk_add_ips_from_cidr(self, site_id: int, cidr: str,
                               expected_ping: Optional[bool] = None,
                               expected_tcp_ports: Optional[List[int]] = None,
                               expected_udp_ports: Optional[List[int]] = None) -> Dict[str, Any]:
        """
        Expand a CIDR range and add all IPs to a site.

        CIDRs are NOT stored - they are just used to generate IP records.

        Args:
            site_id: Site ID
            cidr: CIDR notation (e.g., "10.0.0.0/24")
            expected_ping: Expected ping response for all IPs
            expected_tcp_ports: List of expected TCP ports for all IPs
            expected_udp_ports: List of expected UDP ports for all IPs

        Returns:
            Dictionary with:
            - cidr: The CIDR that was expanded
            - ip_count: Number of IPs created
            - ips_added: List of IP addresses created
            - ips_skipped: List of IPs that already existed

        Raises:
            ValueError: If site not found or CIDR is invalid/too large
        """
        site = self.db.query(Site).filter(Site.id == site_id).first()
        if not site:
            raise ValueError(f"Site with id {site_id} not found")

        # Validate CIDR format and size
        try:
            network = ipaddress.ip_network(cidr, strict=False)
        except ValueError as e:
            raise ValueError(f"Invalid CIDR notation '{cidr}': {str(e)}")

        # Enforce CIDR size limits (max /24 for IPv4, /64 for IPv6)
        if isinstance(network, ipaddress.IPv4Network) and network.prefixlen < 24:
            raise ValueError(
                f"CIDR '{cidr}' is too large ({network.num_addresses} IPs). "
                f"Maximum allowed is /24 (256 IPs) for IPv4."
            )
        elif isinstance(network, ipaddress.IPv6Network) and network.prefixlen < 64:
            raise ValueError(
                f"CIDR '{cidr}' is too large. "
                f"Maximum allowed is /64 for IPv6."
            )

        # Expand CIDR to individual IPs (no cidr_id since we're not storing CIDR)
        ip_count, ips_added, ips_skipped = self._expand_cidr_to_ips(
            site_id=site_id,
            network=network,
            expected_ping=expected_ping,
            expected_tcp_ports=expected_tcp_ports or [],
            expected_udp_ports=expected_udp_ports or []
        )

        site.updated_at = datetime.utcnow()
        self.db.commit()

        logger.info(
            f"Expanded CIDR '{cidr}' for site {site_id} ('{site.name}'): "
            f"added {ip_count} IPs, skipped {len(ips_skipped)} duplicates"
        )

        return {
            'cidr': cidr,
            'ip_count': ip_count,
            'ips_added': ips_added,
            'ips_skipped': ips_skipped
        }
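The size guard above can be checked with nothing but the stdlib; /24 is the largest IPv4 range the service accepts:

```python
import ipaddress

for cidr in ("10.0.0.0/24", "10.0.0.0/16"):
    net = ipaddress.ip_network(cidr, strict=False)
    # prefixlen < 24 means more than 256 addresses for IPv4
    accepted = not (isinstance(net, ipaddress.IPv4Network) and net.prefixlen < 24)
    print(f"{cidr}: {net.num_addresses} addresses, accepted={accepted}")
# 10.0.0.0/24: 256 addresses, accepted=True
# 10.0.0.0/16: 65536 addresses, accepted=False
```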
    def bulk_add_ips_from_list(self, site_id: int, ip_list: List[str],
                               expected_ping: Optional[bool] = None,
                               expected_tcp_ports: Optional[List[int]] = None,
                               expected_udp_ports: Optional[List[int]] = None) -> Dict[str, Any]:
        """
        Add multiple IPs from a list (e.g., from CSV/text import).

        Args:
            site_id: Site ID
            ip_list: List of IP addresses as strings
            expected_ping: Expected ping response for all IPs
            expected_tcp_ports: List of expected TCP ports for all IPs
            expected_udp_ports: List of expected UDP ports for all IPs

        Returns:
            Dictionary with:
            - ip_count: Number of IPs successfully created
            - ips_added: List of IP addresses created
            - ips_skipped: List of IPs that already existed
            - errors: List of validation errors {ip: error_message}

        Raises:
            ValueError: If site not found
        """
        site = self.db.query(Site).filter(Site.id == site_id).first()
        if not site:
            raise ValueError(f"Site with id {site_id} not found")

        ips_added = []
        ips_skipped = []
        errors = []

        for ip_str in ip_list:
            ip_str = ip_str.strip()
            if not ip_str:
                continue  # Skip empty lines

            # Validate IP format
            try:
                ipaddress.ip_address(ip_str)
            except ValueError as e:
                errors.append({'ip': ip_str, 'error': f"Invalid IP address: {str(e)}"})
                continue

            # Check for duplicate (across all IPs in the site)
            existing = (
                self.db.query(SiteIP)
                .filter(SiteIP.site_id == site_id, SiteIP.ip_address == ip_str)
                .first()
            )
            if existing:
                ips_skipped.append(ip_str)
                continue

            # Create IP record
            try:
                ip_obj = SiteIP(
                    site_id=site_id,
                    ip_address=ip_str,
                    expected_ping=expected_ping,
                    expected_tcp_ports=json.dumps(expected_tcp_ports or []),
                    expected_udp_ports=json.dumps(expected_udp_ports or []),
                    created_at=datetime.utcnow()
                )

                self.db.add(ip_obj)
                ips_added.append(ip_str)
            except Exception as e:
                errors.append({'ip': ip_str, 'error': f"Database error: {str(e)}"})

        site.updated_at = datetime.utcnow()
        self.db.commit()

        logger.info(
            f"Bulk added {len(ips_added)} IPs to site {site_id} ('{site.name}'), "
            f"skipped {len(ips_skipped)} duplicates, {len(errors)} errors"
        )

        return {
            'ip_count': len(ips_added),
            'ips_added': ips_added,
            'ips_skipped': ips_skipped,
            'errors': errors
        }

    def add_standalone_ip(self, site_id: int, ip_address: str,
                          expected_ping: Optional[bool] = None,
                          expected_tcp_ports: Optional[List[int]] = None,
                          expected_udp_ports: Optional[List[int]] = None) -> Dict[str, Any]:
        """
        Add a standalone IP (without a CIDR parent) to a site.

        Args:
            site_id: Site ID
            ip_address: IP address to add
            expected_ping: Expected ping response
            expected_tcp_ports: List of expected TCP ports
            expected_udp_ports: List of expected UDP ports

        Returns:
            Dictionary with IP data

        Raises:
            ValueError: If site not found, IP is invalid, or already exists
        """
        site = self.db.query(Site).filter(Site.id == site_id).first()
        if not site:
            raise ValueError(f"Site with id {site_id} not found")

        # Validate IP format
        try:
            ipaddress.ip_address(ip_address)
        except ValueError as e:
            raise ValueError(f"Invalid IP address '{ip_address}': {str(e)}")

        # Check for duplicate (across all IPs in the site)
        existing = (
            self.db.query(SiteIP)
            .filter(SiteIP.site_id == site_id, SiteIP.ip_address == ip_address)
            .first()
        )
        if existing:
            raise ValueError(f"IP '{ip_address}' already exists in this site")

        # Create IP
        ip_obj = SiteIP(
            site_id=site_id,
            ip_address=ip_address,
            expected_ping=expected_ping,
            expected_tcp_ports=json.dumps(expected_tcp_ports or []),
            expected_udp_ports=json.dumps(expected_udp_ports or []),
            created_at=datetime.utcnow()
        )

        self.db.add(ip_obj)
        site.updated_at = datetime.utcnow()
        self.db.commit()
        self.db.refresh(ip_obj)

        logger.info(f"Added IP '{ip_address}' to site {site_id} ('{site.name}')")

        return self._ip_to_dict(ip_obj)

    def update_ip_settings(self, site_id: int, ip_id: int,
                           expected_ping: Optional[bool] = None,
                           expected_tcp_ports: Optional[List[int]] = None,
                           expected_udp_ports: Optional[List[int]] = None) -> Dict[str, Any]:
        """
        Update settings for an individual IP.

        Args:
            site_id: Site ID
            ip_id: IP ID to update
            expected_ping: New ping expectation (if provided)
            expected_tcp_ports: New TCP ports expectation (if provided)
            expected_udp_ports: New UDP ports expectation (if provided)

        Returns:
            Dictionary with updated IP data

        Raises:
            ValueError: If IP not found
        """
        ip_obj = (
            self.db.query(SiteIP)
            .filter(SiteIP.id == ip_id, SiteIP.site_id == site_id)
            .first()
        )
        if not ip_obj:
            raise ValueError(f"IP with id {ip_id} not found for site {site_id}")

        # Update settings if provided
        if expected_ping is not None:
            ip_obj.expected_ping = expected_ping
        if expected_tcp_ports is not None:
            ip_obj.expected_tcp_ports = json.dumps(expected_tcp_ports)
        if expected_udp_ports is not None:
            ip_obj.expected_udp_ports = json.dumps(expected_udp_ports)

        self.db.commit()
        self.db.refresh(ip_obj)

        logger.info(f"Updated settings for IP '{ip_obj.ip_address}' in site {site_id}")

        return self._ip_to_dict(ip_obj)

    def remove_ip(self, site_id: int, ip_id: int) -> None:
        """
        Remove an IP from a site.

        Args:
            site_id: Site ID
            ip_id: IP ID to remove

        Raises:
            ValueError: If IP not found
        """
        ip_obj = (
            self.db.query(SiteIP)
            .filter(SiteIP.id == ip_id, SiteIP.site_id == site_id)
            .first()
        )
        if not ip_obj:
            raise ValueError(f"IP with id {ip_id} not found for site {site_id}")

        ip_address = ip_obj.ip_address
        self.db.delete(ip_obj)
        self.db.commit()

        logger.info(f"Removed IP '{ip_address}' from site {site_id}")

    def list_ips(self, site_id: int, page: int = 1, per_page: int = 50) -> PaginatedResult:
        """
        List IPs in a site with pagination.

        Args:
            site_id: Site ID
            page: Page number (1-indexed)
            per_page: Number of items per page

        Returns:
            PaginatedResult with IP data
        """
        query = (
            self.db.query(SiteIP)
            .filter(SiteIP.site_id == site_id)
            .order_by(SiteIP.ip_address)
        )

        return paginate(query, page, per_page, self._ip_to_dict)

    def get_scan_usage(self, site_id: int) -> List[Dict[str, Any]]:
        """
        Get list of scans that use this site.

        Args:
            site_id: Site ID

        Returns:
            List of scan dictionaries
        """
        from web.models import Scan  # Import here to avoid circular dependency

        associations = (
            self.db.query(ScanSiteAssociation)
            .options(joinedload(ScanSiteAssociation.scan))
            .filter(ScanSiteAssociation.site_id == site_id)
            .all()
        )

        return [
            {
                'id': assoc.scan.id,
                'title': assoc.scan.title,
                'timestamp': assoc.scan.timestamp.isoformat() if assoc.scan.timestamp else None,
                'status': assoc.scan.status
            }
            for assoc in associations
        ]

    # Private helper methods

    def _expand_cidr_to_ips(self, site_id: int,
                            network: ipaddress.IPv4Network | ipaddress.IPv6Network,
                            expected_ping: Optional[bool],
                            expected_tcp_ports: List[int],
                            expected_udp_ports: List[int]) -> tuple[int, List[str], List[str]]:
        """
        Expand a CIDR to individual IP addresses.

        Args:
            site_id: Site ID
            network: ipaddress network object
            expected_ping: Default ping setting for all IPs
            expected_tcp_ports: Default TCP ports for all IPs
            expected_udp_ports: Default UDP ports for all IPs

        Returns:
            Tuple of (count of IPs created, list of IPs added, list of IPs skipped)
        """
        ip_count = 0
        ips_added = []
        ips_skipped = []

        # For /32 or /128 (single host), use the network address
        # For larger ranges, use hosts() to exclude network/broadcast addresses
        if network.num_addresses == 1:
            ip_list = [network.network_address]
        elif network.num_addresses == 2:
            # For /31 networks (point-to-point), both addresses are usable
            ip_list = [network.network_address, network.broadcast_address]
        else:
            # Use hosts() to get usable IPs (excludes network and broadcast)
            ip_list = list(network.hosts())

        for ip in ip_list:
            ip_str = str(ip)

            # Check for duplicate
            existing = (
                self.db.query(SiteIP)
                .filter(SiteIP.site_id == site_id, SiteIP.ip_address == ip_str)
                .first()
            )
            if existing:
                ips_skipped.append(ip_str)
                continue

            # Create SiteIP entry
            ip_obj = SiteIP(
                site_id=site_id,
                ip_address=ip_str,
                expected_ping=expected_ping,
                expected_tcp_ports=json.dumps(expected_tcp_ports),
                expected_udp_ports=json.dumps(expected_udp_ports),
                created_at=datetime.utcnow()
            )

            self.db.add(ip_obj)
            ips_added.append(ip_str)
            ip_count += 1

        return ip_count, ips_added, ips_skipped
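The /32 and /31 branches above matter because `hosts()` treats those sizes specially; a stdlib-only illustration:

```python
import ipaddress

# Recent Python versions already special-case /32 and /31 in hosts();
# the explicit branches above make the behavior version-independent.
print(list(ipaddress.ip_network("192.0.2.5/32").hosts()))
print(list(ipaddress.ip_network("192.0.2.4/31").hosts()))  # both usable (RFC 3021)

# Larger networks drop the network and broadcast addresses:
print(len(list(ipaddress.ip_network("192.0.2.0/24").hosts())))  # 254
```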
    def _site_to_dict(self, site: Site) -> Dict[str, Any]:
        """Convert Site model to dictionary."""
        # Count IPs for this site
        ip_count = (
            self.db.query(func.count(SiteIP.id))
            .filter(SiteIP.site_id == site.id)
            .scalar() or 0
        )

        return {
            'id': site.id,
            'name': site.name,
            'description': site.description,
            'created_at': site.created_at.isoformat() if site.created_at else None,
            'updated_at': site.updated_at.isoformat() if site.updated_at else None,
            'ip_count': ip_count
        }

    def _ip_to_dict(self, ip: SiteIP) -> Dict[str, Any]:
        """Convert SiteIP model to dictionary."""
        return {
            'id': ip.id,
            'site_id': ip.site_id,
            'ip_address': ip.ip_address,
            'expected_ping': ip.expected_ping,
            'expected_tcp_ports': json.loads(ip.expected_tcp_ports) if ip.expected_tcp_ports else [],
            'expected_udp_ports': json.loads(ip.expected_udp_ports) if ip.expected_udp_ports else [],
            'created_at': ip.created_at.isoformat() if ip.created_at else None
        }
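Since expected ports are persisted as JSON strings, a quick sketch of the round trip these helpers rely on:

```python
import json

stored = json.dumps([22, 443])  # what SiteIP.expected_tcp_ports holds in the DB
print(json.loads(stored))       # [22, 443], as surfaced by _ip_to_dict
```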
294 app/web/services/template_service.py Normal file
@@ -0,0 +1,294 @@
"""
Webhook Template Service

Provides Jinja2 template rendering for webhook payloads with a sandboxed
environment and comprehensive context building from scan/alert/rule data.
"""

from jinja2 import Environment, BaseLoader, TemplateError, meta
from jinja2.sandbox import SandboxedEnvironment
import json
from typing import Dict, Any, Optional, Tuple
from datetime import datetime


class TemplateService:
    """
    Service for rendering webhook templates safely using Jinja2.

    Features:
    - Sandboxed Jinja2 environment to prevent code execution
    - Rich context with alert, scan, rule, service, cert data
    - Support for both JSON and text output formats
    - Template validation and error handling
    """

    def __init__(self):
        """Initialize the sandboxed Jinja2 environment."""
        self.env = SandboxedEnvironment(
            loader=BaseLoader(),
            autoescape=False,  # We control the output format
            trim_blocks=True,
            lstrip_blocks=True
        )

        # Add custom filters
        self.env.filters['isoformat'] = self._isoformat_filter

    def _isoformat_filter(self, value):
        """Custom filter to convert datetime to ISO format."""
        if isinstance(value, datetime):
            return value.isoformat()
        return str(value)

    def build_context(
        self,
        alert,
        scan,
        rule,
        app_name: str = "SneakyScanner",
        app_version: str = "1.0.0",
        app_url: str = "https://github.com/sneakygeek/SneakyScan"
    ) -> Dict[str, Any]:
        """
        Build the template context from alert, scan, and rule objects.

        Args:
            alert: Alert model instance
            scan: Scan model instance
            rule: AlertRule model instance
            app_name: Application name
            app_version: Application version
            app_url: Application repository URL

        Returns:
            Dictionary with all available template variables
        """
        context = {
            "alert": {
                "id": alert.id,
                "type": alert.alert_type,
                "severity": alert.severity,
                "message": alert.message,
                "ip_address": alert.ip_address,
                "port": alert.port,
                "acknowledged": alert.acknowledged,
                "acknowledged_at": alert.acknowledged_at,
                "acknowledged_by": alert.acknowledged_by,
                "created_at": alert.created_at,
                "email_sent": alert.email_sent,
                "email_sent_at": alert.email_sent_at,
                "webhook_sent": alert.webhook_sent,
                "webhook_sent_at": alert.webhook_sent_at,
            },
            "scan": {
                "id": scan.id,
                "title": scan.title,
                "timestamp": scan.timestamp,
                "duration": scan.duration,
                "status": scan.status,
                "config_id": scan.config_id,
                "triggered_by": scan.triggered_by,
                "started_at": scan.started_at,
                "completed_at": scan.completed_at,
                "error_message": scan.error_message,
            },
            "rule": {
                "id": rule.id,
                "name": rule.name,
                "type": rule.rule_type,
                "threshold": rule.threshold,
                "severity": rule.severity,
                "enabled": rule.enabled,
            },
            "app": {
                "name": app_name,
                "version": app_version,
                "url": app_url,
            },
            "timestamp": datetime.utcnow(),
        }

        # Add service information if available (for service-related alerts)
        # This would require additional context from the caller
        # For now, we'll add placeholder support
        context["service"] = None
        context["cert"] = None
        context["tls"] = None

        return context

    def render(
        self,
        template_string: str,
        context: Dict[str, Any],
        template_format: str = 'json'
    ) -> Tuple[str, Optional[str]]:
        """
        Render a template with the given context.

        Args:
            template_string: The Jinja2 template string
            context: Template context dictionary
            template_format: Output format ('json' or 'text')

        Returns:
            Tuple of (rendered_output, error_message)
            - If successful: (rendered_string, None)
            - If failed: (None, error_message)
        """
        try:
            template = self.env.from_string(template_string)
            rendered = template.render(context)

            # For JSON format, validate that the output is valid JSON
            if template_format == 'json':
                try:
                    # Parse to validate JSON structure
                    json.loads(rendered)
                except json.JSONDecodeError as e:
                    return None, f"Template rendered invalid JSON: {str(e)}"

            return rendered, None

        except TemplateError as e:
            return None, f"Template rendering error: {str(e)}"
        except Exception as e:
            return None, f"Unexpected error rendering template: {str(e)}"
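A minimal sketch of the sandboxed render-and-validate round trip, using only the jinja2 package:

```python
import json
from jinja2.sandbox import SandboxedEnvironment

env = SandboxedEnvironment()
tpl = env.from_string('{"alert": "{{ alert.message }}", "port": {{ alert.port }}}')
rendered = tpl.render({"alert": {"message": "Unexpected port", "port": 8080}})

# The service re-parses JSON output so a broken template fails fast
print(json.loads(rendered)["port"])  # 8080

# The sandbox blocks underscore attributes that could escape into internals
try:
    env.from_string("{{ obj.__class__ }}").render({"obj": object()})
except Exception as e:
    print(f"blocked: {type(e).__name__}")
```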
    def validate_template(
        self,
        template_string: str,
        template_format: str = 'json'
    ) -> Tuple[bool, Optional[str]]:
        """
        Validate a template without rendering it.

        Args:
            template_string: The Jinja2 template string to validate
            template_format: Expected output format ('json' or 'text')

        Returns:
            Tuple of (is_valid, error_message)
            - If valid: (True, None)
            - If invalid: (False, error_message)
        """
        try:
            # Parse the template to check syntax
            self.env.parse(template_string)

            # For JSON templates, check if it looks like valid JSON structure
            # (this is a basic check - full validation happens during render)
            if template_format == 'json':
                # Just check for basic JSON structure markers
                stripped = template_string.strip()
                if not (stripped.startswith('{') or stripped.startswith('[')):
                    return False, "JSON template must start with { or ["

            return True, None

        except TemplateError as e:
            return False, f"Template syntax error: {str(e)}"
        except Exception as e:
            return False, f"Template validation error: {str(e)}"

    def get_template_variables(self, template_string: str) -> set:
        """
        Extract all variables used in a template.

        Args:
            template_string: The Jinja2 template string

        Returns:
            Set of variable names used in the template
        """
        try:
            ast = self.env.parse(template_string)
            return meta.find_undeclared_variables(ast)
        except Exception:
            return set()
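For reference, what `get_template_variables` returns under the hood via jinja2's meta helper:

```python
from jinja2 import Environment, meta

env = Environment()
ast = env.parse('{"rule": "{{ rule.name }}", "host": "{{ alert.ip_address }}"}')
print(meta.find_undeclared_variables(ast))  # {'rule', 'alert'}
```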
    def render_test_payload(
        self,
        template_string: str,
        template_format: str = 'json'
    ) -> Tuple[str, Optional[str]]:
        """
        Render a template with sample/test data for preview purposes.

        Args:
            template_string: The Jinja2 template string
            template_format: Output format ('json' or 'text')

        Returns:
            Tuple of (rendered_output, error_message)
        """
        # Create sample context data
        sample_context = {
            "alert": {
                "id": 123,
                "type": "unexpected_port",
                "severity": "warning",
                "message": "Unexpected port 8080 found open on 192.168.1.100",
                "ip_address": "192.168.1.100",
                "port": 8080,
                "acknowledged": False,
                "acknowledged_at": None,
                "acknowledged_by": None,
                "created_at": datetime.utcnow(),
                "email_sent": False,
                "email_sent_at": None,
                "webhook_sent": False,
                "webhook_sent_at": None,
            },
            "scan": {
                "id": 456,
                "title": "Production Infrastructure Scan",
                "timestamp": datetime.utcnow(),
                "duration": 125.5,
                "status": "completed",
                "config_id": 1,
                "triggered_by": "schedule",
                "started_at": datetime.utcnow(),
                "completed_at": datetime.utcnow(),
                "error_message": None,
            },
            "rule": {
                "id": 789,
                "name": "Unexpected Port Detection",
                "type": "unexpected_port",
                "threshold": None,
                "severity": "warning",
                "enabled": True,
            },
            "service": {
                "name": "http",
                "product": "nginx",
                "version": "1.20.0",
            },
            "cert": {
                "subject": "CN=example.com",
                "issuer": "CN=Let's Encrypt Authority X3",
                "days_until_expiry": 15,
            },
            "app": {
                "name": "SneakyScanner",
                "version": "1.0.0-phase5",
                "url": "https://github.com/sneakygeek/SneakyScan",
            },
            "timestamp": datetime.utcnow(),
        }

        return self.render(template_string, sample_context, template_format)


# Singleton instance
_template_service = None


def get_template_service() -> TemplateService:
    """Get the singleton TemplateService instance."""
    global _template_service
    if _template_service is None:
        _template_service = TemplateService()
    return _template_service
566 app/web/services/webhook_service.py Normal file
@@ -0,0 +1,566 @@
"""
Webhook Service Module

Handles webhook delivery for alert notifications with retry logic,
authentication support, and comprehensive logging.
"""

import json
import logging
import time
from datetime import datetime, timezone
from typing import List, Dict, Optional, Any, Tuple
from sqlalchemy.orm import Session
import requests
from requests.auth import HTTPBasicAuth

from cryptography.fernet import Fernet
import os

from ..models import Webhook, WebhookDeliveryLog, Alert, AlertRule, Scan
from .template_service import get_template_service
from ..config import APP_NAME, APP_VERSION, REPO_URL

logger = logging.getLogger(__name__)


class WebhookService:
    """
    Service for webhook delivery and management.

    Handles queuing webhook deliveries, executing HTTP requests with
    authentication, retry logic, and logging delivery attempts.
    """

    def __init__(self, db_session: Session, encryption_key: Optional[bytes] = None):
        """
        Initialize webhook service.

        Args:
            db_session: SQLAlchemy database session
            encryption_key: Fernet encryption key for auth_token encryption
        """
        self.db = db_session
        self._encryption_key = encryption_key or self._get_encryption_key()
        self._cipher = Fernet(self._encryption_key) if self._encryption_key else None

    def _get_encryption_key(self) -> Optional[bytes]:
        """
        Get encryption key from environment or database.

        Returns:
            Fernet encryption key or None if not available
        """
        # Try environment variable first
        key_str = os.environ.get('SNEAKYSCANNER_ENCRYPTION_KEY')
        if key_str:
            return key_str.encode()

        # Try to get from settings (would need to query Setting table)
        # For now, generate a temporary key if none exists
        try:
            return Fernet.generate_key()
        except Exception as e:
            logger.warning(f"Could not generate encryption key: {e}")
            return None

    def _encrypt_value(self, value: str) -> str:
        """Encrypt a string value."""
        if not self._cipher:
            return value  # Return plain text if encryption not available
        return self._cipher.encrypt(value.encode()).decode()

    def _decrypt_value(self, encrypted_value: str) -> str:
        """Decrypt an encrypted string value."""
        if not self._cipher:
            return encrypted_value  # Return as-is if encryption not available
        try:
            return self._cipher.decrypt(encrypted_value.encode()).decode()
        except Exception as e:
            logger.error(f"Failed to decrypt value: {e}")
            return encrypted_value
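A tiny sketch of the Fernet round trip used for auth tokens (cryptography package). Note that the per-process `generate_key()` fallback above means tokens encrypted in an earlier run are only recoverable when a persistent key is supplied via `SNEAKYSCANNER_ENCRYPTION_KEY`:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in production, load a persistent key instead
cipher = Fernet(key)

token = cipher.encrypt(b"user:hunter2").decode()  # stored form
print(cipher.decrypt(token.encode()).decode())    # "user:hunter2"
```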
    def get_matching_webhooks(self, alert: Alert) -> List[Webhook]:
        """
        Get all enabled webhooks that match an alert's type and severity.

        Args:
            alert: Alert object to match against

        Returns:
            List of matching Webhook objects
        """
        # Get all enabled webhooks
        webhooks = self.db.query(Webhook).filter(Webhook.enabled == True).all()

        matching_webhooks = []
        for webhook in webhooks:
            # Check if webhook matches alert type filter
            if webhook.alert_types:
                try:
                    alert_types = json.loads(webhook.alert_types)
                    if alert.alert_type not in alert_types:
                        continue  # Skip if alert type doesn't match
                except json.JSONDecodeError:
                    logger.warning(f"Invalid alert_types JSON for webhook {webhook.id}")
                    continue

            # Check if webhook matches severity filter
            if webhook.severity_filter:
                try:
                    severity_filter = json.loads(webhook.severity_filter)
                    if alert.severity not in severity_filter:
                        continue  # Skip if severity doesn't match
                except json.JSONDecodeError:
                    logger.warning(f"Invalid severity_filter JSON for webhook {webhook.id}")
                    continue

            matching_webhooks.append(webhook)

        logger.info(f"Found {len(matching_webhooks)} matching webhooks for alert {alert.id}")
        return matching_webhooks

    def queue_webhook_delivery(self, webhook_id: int, alert_id: int, scheduler_service=None) -> bool:
        """
        Queue a webhook delivery for async execution via APScheduler.

        Args:
            webhook_id: ID of webhook to deliver
            alert_id: ID of alert to send
            scheduler_service: SchedulerService instance (if None, deliver synchronously)

        Returns:
            True if queued successfully, False otherwise
        """
        if scheduler_service and scheduler_service.scheduler:
            try:
                # Import here to avoid circular dependency
                from web.jobs.webhook_job import execute_webhook_delivery

                # Schedule immediate execution
                scheduler_service.scheduler.add_job(
                    execute_webhook_delivery,
                    args=[webhook_id, alert_id, scheduler_service.db_url],
                    id=f"webhook_{webhook_id}_{alert_id}_{int(time.time())}",
                    replace_existing=False
                )
                logger.info(f"Queued webhook {webhook_id} for alert {alert_id}")
                return True
            except Exception as e:
                logger.error(f"Failed to queue webhook delivery: {e}")
                # Fall back to synchronous delivery
                return self.deliver_webhook(webhook_id, alert_id)
        else:
            # No scheduler available, deliver synchronously
            logger.info(f"No scheduler available, delivering webhook {webhook_id} synchronously")
            return self.deliver_webhook(webhook_id, alert_id)

    def deliver_webhook(self, webhook_id: int, alert_id: int, attempt_number: int = 1) -> bool:
        """
        Deliver a webhook with retry logic.

        Args:
            webhook_id: ID of webhook to deliver
            alert_id: ID of alert to send
            attempt_number: Current attempt number (for retries)

        Returns:
            True if delivered successfully, False otherwise
        """
        # Get webhook and alert
        webhook = self.db.query(Webhook).filter(Webhook.id == webhook_id).first()
        if not webhook:
            logger.error(f"Webhook {webhook_id} not found")
            return False

        alert = self.db.query(Alert).filter(Alert.id == alert_id).first()
        if not alert:
            logger.error(f"Alert {alert_id} not found")
            return False

        logger.info(f"Delivering webhook {webhook_id} for alert {alert_id} (attempt {attempt_number}/{webhook.retry_count})")

        # Build payload with template support
        payload, content_type = self._build_payload(webhook, alert)

        # Prepare headers
        headers = {'Content-Type': content_type}

        # Add custom headers if provided
        if webhook.custom_headers:
            try:
                custom_headers = json.loads(webhook.custom_headers)
                headers.update(custom_headers)
            except json.JSONDecodeError:
                logger.warning(f"Invalid custom_headers JSON for webhook {webhook_id}")

        # Prepare authentication
        auth = None
        if webhook.auth_type == 'bearer' and webhook.auth_token:
            decrypted_token = self._decrypt_value(webhook.auth_token)
            headers['Authorization'] = f'Bearer {decrypted_token}'
        elif webhook.auth_type == 'basic' and webhook.auth_token:
            # Expecting format: "username:password"
            decrypted_token = self._decrypt_value(webhook.auth_token)
            if ':' in decrypted_token:
                username, password = decrypted_token.split(':', 1)
                auth = HTTPBasicAuth(username, password)

        # Execute HTTP request
        try:
            timeout = webhook.timeout or 10

            # Use appropriate parameter based on payload type
            if isinstance(payload, dict):
                # JSON payload
                response = requests.post(
                    webhook.url,
                    json=payload,
                    headers=headers,
                    auth=auth,
                    timeout=timeout
                )
            else:
                # Text payload
                response = requests.post(
                    webhook.url,
                    data=payload,
                    headers=headers,
                    auth=auth,
                    timeout=timeout
                )

            # Log delivery attempt
            log_entry = WebhookDeliveryLog(
                webhook_id=webhook_id,
                alert_id=alert_id,
                status='success' if response.status_code < 400 else 'failed',
                response_code=response.status_code,
                response_body=response.text[:1000],  # Limit to 1000 chars
                error_message=None if response.status_code < 400 else f"HTTP {response.status_code}",
                attempt_number=attempt_number,
                delivered_at=datetime.now(timezone.utc)
            )
            self.db.add(log_entry)

            # Update alert webhook status if successful
            if response.status_code < 400:
                alert.webhook_sent = True
                alert.webhook_sent_at = datetime.now(timezone.utc)
                logger.info(f"Webhook {webhook_id} delivered successfully (HTTP {response.status_code})")
                self.db.commit()
                return True
            else:
                # Failed but got a response
                logger.warning(f"Webhook {webhook_id} failed with HTTP {response.status_code}")
                self.db.commit()

                # Retry if attempts remaining
                if attempt_number < webhook.retry_count:
                    delay = self._calculate_retry_delay(attempt_number)
                    logger.info(f"Retrying webhook {webhook_id} in {delay} seconds")
                    time.sleep(delay)
                    return self.deliver_webhook(webhook_id, alert_id, attempt_number + 1)
                return False

        except requests.exceptions.Timeout:
            error_msg = f"Request timeout after {timeout} seconds"
            logger.error(f"Webhook {webhook_id} timeout: {error_msg}")
            self._log_delivery_failure(webhook_id, alert_id, error_msg, attempt_number)

        except requests.exceptions.ConnectionError as e:
            error_msg = f"Connection error: {str(e)}"
            logger.error(f"Webhook {webhook_id} connection error: {error_msg}")
            self._log_delivery_failure(webhook_id, alert_id, error_msg, attempt_number)

        except requests.exceptions.RequestException as e:
            error_msg = f"Request error: {str(e)}"
            logger.error(f"Webhook {webhook_id} request error: {error_msg}")
            self._log_delivery_failure(webhook_id, alert_id, error_msg, attempt_number)

        except Exception as e:
            error_msg = f"Unexpected error: {str(e)}"
            logger.error(f"Webhook {webhook_id} unexpected error: {error_msg}")
            self._log_delivery_failure(webhook_id, alert_id, error_msg, attempt_number)

        # Retry if attempts remaining
        if attempt_number < webhook.retry_count:
            delay = self._calculate_retry_delay(attempt_number)
            logger.info(f"Retrying webhook {webhook_id} in {delay} seconds")
            time.sleep(delay)
            return self.deliver_webhook(webhook_id, alert_id, attempt_number + 1)

        return False
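One nuance in the branch above: with `json=`, requests serializes the payload and would set `Content-Type: application/json` itself, but caller-supplied headers take precedence, so the explicitly built header stays authoritative. A sketch against a public echo service (network required, endpoint chosen for illustration only):

```python
import requests

r = requests.post(
    "https://httpbin.org/post",  # echoes the request back
    json={"ping": 1},
    headers={"Content-Type": "application/json"},
    timeout=10,
)
print(r.json()["headers"]["Content-Type"])  # application/json
```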
    def _log_delivery_failure(self, webhook_id: int, alert_id: int, error_message: str, attempt_number: int):
        """Log a failed delivery attempt."""
        log_entry = WebhookDeliveryLog(
            webhook_id=webhook_id,
            alert_id=alert_id,
            status='failed',
            response_code=None,
            response_body=None,
            error_message=error_message[:500],  # Limit error message length
            attempt_number=attempt_number,
            delivered_at=datetime.now(timezone.utc)
        )
        self.db.add(log_entry)
        self.db.commit()

    def _calculate_retry_delay(self, attempt_number: int) -> int:
        """
        Calculate exponential backoff delay for retries.

        Args:
            attempt_number: Current attempt number

        Returns:
            Delay in seconds
        """
        # Exponential backoff: 2^attempt seconds (2, 4, 8, 16...)
        return min(2 ** attempt_number, 60)  # Cap at 60 seconds
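For reference, the delay schedule this produces: 2^n seconds, capped at 60.

```python
def retry_delay(attempt_number: int) -> int:
    # Same formula as _calculate_retry_delay above
    return min(2 ** attempt_number, 60)

print([retry_delay(n) for n in range(1, 8)])  # [2, 4, 8, 16, 32, 60, 60]
```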
    def _build_payload(self, webhook: Webhook, alert: Alert) -> Tuple[Any, str]:
        """
        Build payload for webhook delivery using template if configured.

        Args:
            webhook: Webhook object with optional template configuration
            alert: Alert object

        Returns:
            Tuple of (payload, content_type):
            - payload: Rendered payload (dict for JSON, string for text)
            - content_type: Content-Type header value
        """
        # Get related scan
        scan = self.db.query(Scan).filter(Scan.id == alert.scan_id).first()

        # Get related rule
        rule = self.db.query(AlertRule).filter(AlertRule.id == alert.rule_id).first()

        # If webhook has a custom template, use it
        if webhook.template:
            template_service = get_template_service()
            context = template_service.build_context(
                alert=alert,
                scan=scan,
                rule=rule,
                app_name=APP_NAME,
                app_version=APP_VERSION,
                app_url=REPO_URL
            )

            rendered, error = template_service.render(
                webhook.template,
                context,
                webhook.template_format or 'json'
            )

            if error:
                logger.error(f"Template rendering error for webhook {webhook.id}: {error}")
                # Fall back to default payload
                return self._build_default_payload(alert, scan, rule), 'application/json'

            # Determine content type
            if webhook.content_type_override:
                content_type = webhook.content_type_override
            elif webhook.template_format == 'text':
                content_type = 'text/plain'
            else:
                content_type = 'application/json'

            # For JSON format, parse the rendered string back to a dict
            # For text format, return as string
            if webhook.template_format == 'json':
                try:
                    payload = json.loads(rendered)
                except json.JSONDecodeError:
                    logger.error(f"Failed to parse rendered JSON template for webhook {webhook.id}")
                    return self._build_default_payload(alert, scan, rule), 'application/json'
            else:
                payload = rendered

            return payload, content_type

        # No template - use default payload
        return self._build_default_payload(alert, scan, rule), 'application/json'

    def _build_default_payload(self, alert: Alert, scan: Optional[Scan], rule: Optional[AlertRule]) -> Dict[str, Any]:
        """
        Build default JSON payload for webhook delivery.

        Args:
            alert: Alert object
            scan: Scan object (optional)
            rule: AlertRule object (optional)

        Returns:
            Dict containing alert details in generic JSON format
        """
        payload = {
            "event": "alert.created",
            "alert": {
                "id": alert.id,
                "type": alert.alert_type,
                "severity": alert.severity,
                "message": alert.message,
                "ip_address": alert.ip_address,
                "port": alert.port,
                "acknowledged": alert.acknowledged,
                "created_at": alert.created_at.isoformat() if alert.created_at else None
            },
            "scan": {
                "id": scan.id if scan else None,
                "title": scan.title if scan else None,
                "timestamp": scan.timestamp.isoformat() if scan and scan.timestamp else None,
                "status": scan.status if scan else None
            },
            "rule": {
                "id": rule.id if rule else None,
                "name": rule.name if rule else None,
                "type": rule.rule_type if rule else None,
                "threshold": rule.threshold if rule else None
            }
        }

        return payload

    def test_webhook(self, webhook_id: int) -> Dict[str, Any]:
        """
        Send a test payload to a webhook.

        Args:
            webhook_id: ID of webhook to test

        Returns:
            Dict with test result details
        """
        webhook = self.db.query(Webhook).filter(Webhook.id == webhook_id).first()
        if not webhook:
            return {
                'success': False,
                'message': 'Webhook not found',
                'status_code': None
            }

        # Build test payload - use template if configured
        if webhook.template:
            template_service = get_template_service()
            rendered, error = template_service.render_test_payload(
                webhook.template,
                webhook.template_format or 'json'
            )

            if error:
                return {
                    'success': False,
                    'message': f'Template error: {error}',
                    'status_code': None
                }

            # Determine content type
            if webhook.content_type_override:
                content_type = webhook.content_type_override
            elif webhook.template_format == 'text':
                content_type = 'text/plain'
            else:
                content_type = 'application/json'

            # Parse JSON template
            if webhook.template_format == 'json':
                try:
                    payload = json.loads(rendered)
                except json.JSONDecodeError:
                    return {
                        'success': False,
                        'message': 'Template rendered invalid JSON',
                        'status_code': None
                    }
            else:
                payload = rendered
        else:
            # Default test payload
            payload = {
                "event": "webhook.test",
                "message": "This is a test webhook from SneakyScanner",
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "webhook": {
                    "id": webhook.id,
                    "name": webhook.name
                }
            }
            content_type = 'application/json'

        # Prepare headers
        headers = {'Content-Type': content_type}

        if webhook.custom_headers:
            try:
                custom_headers = json.loads(webhook.custom_headers)
                headers.update(custom_headers)
            except json.JSONDecodeError:
                pass

        # Prepare authentication
        auth = None
        if webhook.auth_type == 'bearer' and webhook.auth_token:
            decrypted_token = self._decrypt_value(webhook.auth_token)
            headers['Authorization'] = f'Bearer {decrypted_token}'
        elif webhook.auth_type == 'basic' and webhook.auth_token:
            decrypted_token = self._decrypt_value(webhook.auth_token)
            if ':' in decrypted_token:
                username, password = decrypted_token.split(':', 1)
                auth = HTTPBasicAuth(username, password)

        # Execute test request
        try:
            timeout = webhook.timeout or 10

            # Use appropriate parameter based on payload type
            if isinstance(payload, dict):
                # JSON payload
                response = requests.post(
                    webhook.url,
                    json=payload,
                    headers=headers,
                    auth=auth,
                    timeout=timeout
                )
            else:
                # Text payload
                response = requests.post(
                    webhook.url,
                    data=payload,
                    headers=headers,
                    auth=auth,
                    timeout=timeout
                )

            return {
                'success': response.status_code < 400,
                'message': f'HTTP {response.status_code}',
                'status_code': response.status_code,
                'response_body': response.text[:500]
            }

        except requests.exceptions.Timeout:
            return {
                'success': False,
                'message': f'Request timeout after {timeout} seconds',
                'status_code': None
            }

        except requests.exceptions.ConnectionError as e:
            return {
                'success': False,
                'message': f'Connection error: {str(e)}',
                'status_code': None
            }

        except Exception as e:
            return {
                'success': False,
                'message': f'Error: {str(e)}',
                'status_code': None
            }
File diff suppressed because it is too large
505
app/web/templates/alert_rules.html
Normal file
505
app/web/templates/alert_rules.html
Normal file
@@ -0,0 +1,505 @@
{% extends "base.html" %}

{% block title %}Alert Rules - SneakyScanner{% endblock %}

{% block content %}
<div class="row mt-4">
    <div class="col-12 d-flex justify-content-between align-items-center mb-4">
        <h1>Alert Rules</h1>
        <div>
            <a href="{{ url_for('main.alerts') }}" class="btn btn-outline-primary me-2">
                <i class="bi bi-bell"></i> View Alerts
            </a>
            <button class="btn btn-primary" onclick="showCreateRuleModal()">
                <i class="bi bi-plus-circle"></i> Create Rule
            </button>
        </div>
    </div>
</div>

<!-- Rule Statistics -->
<div class="row mb-4">
    <div class="col-md-6 mb-3">
        <div class="card">
            <div class="card-body">
                <h6 class="text-muted mb-2">Total Rules</h6>
                <h3 class="mb-0">{{ rules | length }}</h3>
            </div>
        </div>
    </div>
    <div class="col-md-6 mb-3">
        <div class="card">
            <div class="card-body">
                <h6 class="text-muted mb-2">Active Rules</h6>
                <h3 class="mb-0 text-success">{{ rules | selectattr('enabled') | list | length }}</h3>
            </div>
        </div>
    </div>
</div>

<!-- Rules Table -->
<div class="row">
    <div class="col-12">
        <div class="card">
            <div class="card-header">
                <h5 class="mb-0">Alert Rules Configuration</h5>
            </div>
            <div class="card-body">
                {% if rules %}
                <div class="table-responsive">
                    <table class="table table-hover">
                        <thead>
                            <tr>
                                <th>Name</th>
                                <th>Type</th>
                                <th>Severity</th>
                                <th>Threshold</th>
                                <th>Config</th>
                                <th>Notifications</th>
                                <th>Status</th>
                                <th>Actions</th>
                            </tr>
                        </thead>
                        <tbody>
                            {% for rule in rules %}
                            <tr>
                                <td>
                                    <strong>{{ rule.name or 'Unnamed Rule' }}</strong>
                                    <br>
                                    <small class="text-muted">ID: {{ rule.id }}</small>
                                </td>
                                <td>
                                    <span class="badge bg-secondary">
                                        {{ rule.rule_type.replace('_', ' ').title() }}
                                    </span>
                                </td>
                                <td>
                                    {% if rule.severity == 'critical' %}
                                    <span class="badge bg-danger">Critical</span>
                                    {% elif rule.severity == 'warning' %}
                                    <span class="badge bg-warning">Warning</span>
                                    {% else %}
                                    <span class="badge bg-info">{{ rule.severity or 'Info' }}</span>
                                    {% endif %}
                                </td>
                                <td>
                                    {% if rule.threshold %}
                                        {% if rule.rule_type == 'cert_expiry' %}
                                            {{ rule.threshold }} days
                                        {% elif rule.rule_type == 'drift_detection' %}
                                            {{ rule.threshold }}%
                                        {% else %}
                                            {{ rule.threshold }}
                                        {% endif %}
                                    {% else %}
                                        <span class="text-muted">-</span>
                                    {% endif %}
                                </td>
                                <td>
                                    {% if rule.config %}
                                    <small class="text-muted">{{ rule.config.title }}</small>
                                    {% else %}
                                    <span class="badge bg-primary">All Configs</span>
                                    {% endif %}
                                </td>
                                <td>
                                    {% if rule.email_enabled %}
                                    <i class="bi bi-envelope-fill text-primary" title="Email enabled"></i>
                                    {% endif %}
                                    {% if rule.webhook_enabled %}
                                    <i class="bi bi-send-fill text-primary" title="Webhook enabled"></i>
                                    {% endif %}
                                    {% if not rule.email_enabled and not rule.webhook_enabled %}
                                    <span class="text-muted">None</span>
                                    {% endif %}
                                </td>
                                <td>
                                    <div class="form-check form-switch">
                                        <input class="form-check-input" type="checkbox"
                                               id="rule-enabled-{{ rule.id }}"
                                               {% if rule.enabled %}checked{% endif %}
                                               onchange="toggleRule({{ rule.id }}, this.checked)">
                                        <label class="form-check-label" for="rule-enabled-{{ rule.id }}">
                                            {% if rule.enabled %}
                                            <span class="text-success ms-2">Active</span>
                                            {% else %}
                                            <span class="text-muted ms-2">Inactive</span>
                                            {% endif %}
                                        </label>
                                    </div>
                                </td>
                                <td>
                                    <button class="btn btn-sm btn-outline-primary" onclick="editRule({{ rule.id }})">
                                        <i class="bi bi-pencil"></i>
                                    </button>
                                    <button class="btn btn-sm btn-outline-danger" onclick="deleteRule({{ rule.id }}, '{{ rule.name }}')">
                                        <i class="bi bi-trash"></i>
                                    </button>
                                </td>
                            </tr>
                            {% endfor %}
                        </tbody>
                    </table>
                </div>
                {% else %}
                <div class="text-center py-5 text-muted">
                    <i class="bi bi-bell-slash" style="font-size: 3rem;"></i>
                    <h5 class="mt-3">No alert rules configured</h5>
                    <p>Create alert rules to be notified of important scan findings.</p>
                    <button class="btn btn-primary mt-3" onclick="showCreateRuleModal()">
                        <i class="bi bi-plus-circle"></i> Create Your First Rule
                    </button>
                </div>
                {% endif %}
            </div>
        </div>
    </div>
</div>

<!-- Create/Edit Rule Modal -->
<div class="modal fade" id="ruleModal" tabindex="-1">
    <div class="modal-dialog modal-lg">
        <div class="modal-content">
            <div class="modal-header">
                <h5 class="modal-title" id="ruleModalTitle">Create Alert Rule</h5>
                <button type="button" class="btn-close" data-bs-dismiss="modal"></button>
            </div>
            <div class="modal-body">
                <form id="ruleForm">
                    <input type="hidden" id="rule-id">

                    <div class="row">
                        <div class="col-md-6 mb-3">
                            <label for="rule-name" class="form-label">Rule Name</label>
                            <input type="text" class="form-control" id="rule-name" required>
                        </div>
                        <div class="col-md-6 mb-3">
                            <label for="rule-type" class="form-label">Rule Type</label>
                            <select class="form-select" id="rule-type" required onchange="updateThresholdLabel()">
                                <option value="">Select a type...</option>
                                <option value="unexpected_port">Unexpected Port Detection</option>
                                <option value="drift_detection">Drift Detection</option>
                                <option value="cert_expiry">Certificate Expiry</option>
                                <option value="weak_tls">Weak TLS Version</option>
                                <option value="ping_failed">Ping Failed</option>
                            </select>
                        </div>
                    </div>

                    <div class="row">
                        <div class="col-md-6 mb-3">
                            <label for="rule-severity" class="form-label">Severity</label>
                            <select class="form-select" id="rule-severity" required>
                                <option value="info">Info</option>
                                <option value="warning" selected>Warning</option>
                                <option value="critical">Critical</option>
                            </select>
                        </div>
                        <div class="col-md-6 mb-3">
                            <label for="rule-threshold" class="form-label" id="threshold-label">Threshold</label>
                            <input type="number" class="form-control" id="rule-threshold">
                            <small class="form-text text-muted" id="threshold-help">
                                Numeric value that triggers the alert (varies by rule type)
                            </small>
                        </div>
                    </div>

                    <div class="row">
                        <div class="col-md-12 mb-3">
                            <label for="rule-config" class="form-label">Apply to Config (optional)</label>
                            <select class="form-select" id="rule-config">
                                <option value="">All Configs (Apply to all scans)</option>
                            </select>
                            <small class="form-text text-muted" id="config-help-text">
                                Select a specific config to limit this rule, or leave as "All Configs" to apply to all scans
                            </small>
                        </div>
                    </div>

                    <div class="row">
                        <div class="col-md-6">
                            <div class="form-check">
                                <input class="form-check-input" type="checkbox" id="rule-email">
                                <label class="form-check-label" for="rule-email">
                                    Send Email Notifications
                                </label>
                            </div>
                        </div>
                        <div class="col-md-6">
                            <div class="form-check">
                                <input class="form-check-input" type="checkbox" id="rule-webhook">
                                <label class="form-check-label" for="rule-webhook">
                                    Send Webhook Notifications
                                </label>
                            </div>
                        </div>
                    </div>

                    <div class="row mt-3">
                        <div class="col-12">
                            <div class="form-check">
                                <input class="form-check-input" type="checkbox" id="rule-enabled" checked>
                                <label class="form-check-label" for="rule-enabled">
                                    Enable this rule immediately
                                </label>
                            </div>
                        </div>
                    </div>
                </form>
            </div>
            <div class="modal-footer">
                <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
                <button type="button" class="btn btn-primary" onclick="saveRule()">
                    <span id="save-rule-text">Create Rule</span>
                    <span id="save-rule-spinner" class="spinner-border spinner-border-sm ms-2" style="display: none;"></span>
                </button>
            </div>
        </div>
    </div>
</div>

<script>
let editingRuleId = null;

// Load available configs for the dropdown
async function loadConfigsForRule() {
    const selectEl = document.getElementById('rule-config');

    try {
        const response = await fetch('/api/configs');
        if (!response.ok) {
            throw new Error('Failed to load configurations');
        }

        const data = await response.json();
        const configs = data.configs || [];

        // Preserve the "All Configs" option and current selection
        const currentValue = selectEl.value;
        selectEl.innerHTML = '<option value="">All Configs (Apply to all scans)</option>';

        configs.forEach(config => {
            const option = document.createElement('option');
            // Store the config ID as the value
            option.value = config.id;
            const siteText = config.site_count === 1 ? 'site' : 'sites';
            option.textContent = `${config.title} (${config.site_count} ${siteText})`;
            selectEl.appendChild(option);
        });

        // Restore selection if it was set
        if (currentValue) {
            selectEl.value = currentValue;
        }
    } catch (error) {
        console.error('Error loading configs:', error);
    }
}

function showCreateRuleModal() {
    editingRuleId = null;
    document.getElementById('ruleModalTitle').textContent = 'Create Alert Rule';
    document.getElementById('save-rule-text').textContent = 'Create Rule';
    document.getElementById('ruleForm').reset();
    document.getElementById('rule-enabled').checked = true;

    // Load configs when modal is shown
    loadConfigsForRule();

    new bootstrap.Modal(document.getElementById('ruleModal')).show();
}

function editRule(ruleId) {
    editingRuleId = ruleId;
    document.getElementById('ruleModalTitle').textContent = 'Edit Alert Rule';
    document.getElementById('save-rule-text').textContent = 'Update Rule';

    // Load configs first, then fetch rule details
    loadConfigsForRule().then(() => {
        // Fetch rule details
        fetch(`/api/alerts/rules`, {
            headers: {
                'X-API-Key': localStorage.getItem('api_key') || ''
            }
        })
        .then(response => response.json())
        .then(data => {
            const rule = data.rules.find(r => r.id === ruleId);
            if (rule) {
                document.getElementById('rule-id').value = rule.id;
                document.getElementById('rule-name').value = rule.name || '';
                document.getElementById('rule-type').value = rule.rule_type;
                document.getElementById('rule-severity').value = rule.severity || 'warning';
                document.getElementById('rule-threshold').value = rule.threshold || '';
                document.getElementById('rule-config').value = rule.config_id || '';
                document.getElementById('rule-email').checked = rule.email_enabled;
                document.getElementById('rule-webhook').checked = rule.webhook_enabled;
                document.getElementById('rule-enabled').checked = rule.enabled;

                updateThresholdLabel();
                new bootstrap.Modal(document.getElementById('ruleModal')).show();
            }
        })
        .catch(error => {
            console.error('Error fetching rule:', error);
            alert('Failed to load rule details');
        });
    });
}

function updateThresholdLabel() {
    const ruleType = document.getElementById('rule-type').value;
    const label = document.getElementById('threshold-label');
    const help = document.getElementById('threshold-help');

    switch (ruleType) {
        case 'cert_expiry':
            label.textContent = 'Days Before Expiry';
            help.textContent = 'Alert when certificate expires within this many days (default: 30)';
            break;
        case 'drift_detection':
            label.textContent = 'Drift Percentage';
            help.textContent = 'Alert when drift exceeds this percentage (0-100, default: 5)';
            break;
        case 'unexpected_port':
            label.textContent = 'Threshold (optional)';
            help.textContent = 'Leave blank - this rule alerts on any port not in your config file';
            break;
        case 'weak_tls':
            label.textContent = 'Threshold (optional)';
            help.textContent = 'Leave blank - this rule alerts on TLS versions below 1.2';
            break;
        case 'ping_failed':
            label.textContent = 'Threshold (optional)';
            help.textContent = 'Leave blank - this rule alerts when a host fails to respond to ping';
            break;
        default:
            label.textContent = 'Threshold';
            help.textContent = 'Numeric value that triggers the alert (select a rule type for specific guidance)';
    }
}

function saveRule() {
    const name = document.getElementById('rule-name').value;
    const ruleType = document.getElementById('rule-type').value;
    const severity = document.getElementById('rule-severity').value;
    const threshold = document.getElementById('rule-threshold').value;
    const configId = document.getElementById('rule-config').value;
    const emailEnabled = document.getElementById('rule-email').checked;
    const webhookEnabled = document.getElementById('rule-webhook').checked;
    const enabled = document.getElementById('rule-enabled').checked;

    if (!name || !ruleType) {
        alert('Please fill in required fields');
        return;
    }

    const data = {
        name: name,
        rule_type: ruleType,
        severity: severity,
        threshold: threshold ? parseInt(threshold) : null,
        config_id: configId ? parseInt(configId) : null,
        email_enabled: emailEnabled,
        webhook_enabled: webhookEnabled,
        enabled: enabled
    };

    // Show spinner
    document.getElementById('save-rule-text').style.display = 'none';
    document.getElementById('save-rule-spinner').style.display = 'inline-block';

    const url = editingRuleId
        ? `/api/alerts/rules/${editingRuleId}`
        : '/api/alerts/rules';
    const method = editingRuleId ? 'PUT' : 'POST';

    fetch(url, {
        method: method,
        headers: {
            'Content-Type': 'application/json',
            'X-API-Key': localStorage.getItem('api_key') || ''
        },
        body: JSON.stringify(data)
    })
    .then(response => response.json())
    .then(data => {
        if (data.status === 'success') {
            location.reload();
        } else {
            alert('Failed to save rule: ' + (data.message || 'Unknown error'));
            // Hide spinner
            document.getElementById('save-rule-text').style.display = 'inline';
            document.getElementById('save-rule-spinner').style.display = 'none';
        }
    })
    .catch(error => {
        console.error('Error:', error);
        alert('Failed to save rule');
        // Hide spinner
        document.getElementById('save-rule-text').style.display = 'inline';
        document.getElementById('save-rule-spinner').style.display = 'none';
    });
}

function toggleRule(ruleId, enabled) {
    fetch(`/api/alerts/rules/${ruleId}`, {
        method: 'PUT',
        headers: {
            'Content-Type': 'application/json',
            'X-API-Key': localStorage.getItem('api_key') || ''
        },
        body: JSON.stringify({ enabled: enabled })
    })
    .then(response => response.json())
    .then(data => {
        if (data.status !== 'success') {
            alert('Failed to update rule status');
            // Revert checkbox
            document.getElementById(`rule-enabled-${ruleId}`).checked = !enabled;
        } else {
            // Update label
            const label = document.querySelector(`label[for="rule-enabled-${ruleId}"] span`);
            if (enabled) {
                label.className = 'text-success';
                label.textContent = 'Active';
            } else {
                label.className = 'text-muted';
                label.textContent = 'Inactive';
            }
        }
    })
    .catch(error => {
        console.error('Error:', error);
        alert('Failed to update rule status');
        // Revert checkbox
        document.getElementById(`rule-enabled-${ruleId}`).checked = !enabled;
    });
}

function deleteRule(ruleId, ruleName) {
    if (!confirm(`Delete alert rule "${ruleName}"? This cannot be undone.`)) {
        return;
    }

    fetch(`/api/alerts/rules/${ruleId}`, {
        method: 'DELETE',
        headers: {
            'X-API-Key': localStorage.getItem('api_key') || ''
        }
    })
    .then(response => response.json())
    .then(data => {
        if (data.status === 'success') {
            location.reload();
        } else {
            alert('Failed to delete rule: ' + (data.message || 'Unknown error'));
        }
    })
    .catch(error => {
        console.error('Error:', error);
        alert('Failed to delete rule');
    });
}
</script>
{% endblock %}
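The modal's JavaScript drives a small REST surface (GET/POST /api/alerts/rules, PUT/DELETE /api/alerts/rules/<id>) authenticated with an X-API-Key header, so the same calls work from a script. A hedged sketch with requests (the base URL and key are placeholders, and the status/message response shape is inferred from the JavaScript above):

import requests

BASE_URL = 'http://localhost:5000'  # placeholder; adjust to your deployment
API_KEY = 'your-api-key'            # placeholder

# Same payload shape that saveRule() builds in the page above
rule = {
    'name': 'Expiring certs (30 days)',
    'rule_type': 'cert_expiry',
    'severity': 'warning',
    'threshold': 30,          # days before expiry, per the form help text
    'config_id': None,        # None = apply to all configs
    'email_enabled': True,
    'webhook_enabled': False,
    'enabled': True,
}

resp = requests.post(
    f'{BASE_URL}/api/alerts/rules',
    json=rule,
    headers={'X-API-Key': API_KEY},
    timeout=10,
)
data = resp.json()
if data.get('status') == 'success':
    print('Rule created')
else:
    print('Failed:', data.get('message', 'Unknown error'))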
303 app/web/templates/alerts.html Normal file
@@ -0,0 +1,303 @@
{% extends "base.html" %}

{% block title %}Alerts - SneakyScanner{% endblock %}

{% block content %}
<div class="row mt-4">
    <div class="col-12 d-flex justify-content-between align-items-center mb-4">
        <h1>Alert History</h1>
        <div>
            <button class="btn btn-success me-2" onclick="acknowledgeAllAlerts()">
                <i class="bi bi-check-all"></i> Ack All
            </button>
            <a href="{{ url_for('main.alert_rules') }}" class="btn btn-primary">
                <i class="bi bi-gear"></i> Manage Alert Rules
            </a>
        </div>
    </div>
</div>

<!-- Alert Statistics -->
<div class="row mb-4">
    <div class="col-md-3 mb-3">
        <div class="card">
            <div class="card-body">
                <h6 class="text-muted mb-2">Total Alerts</h6>
                <h3 class="mb-0">{{ pagination.total }}</h3>
            </div>
        </div>
    </div>
    <div class="col-md-3 mb-3">
        <div class="card">
            <div class="card-body">
                <h6 class="text-muted mb-2">Critical</h6>
                <h3 class="mb-0 text-danger">
                    {{ alerts | selectattr('severity', 'equalto', 'critical') | list | length }}
                </h3>
            </div>
        </div>
    </div>
    <div class="col-md-3 mb-3">
        <div class="card">
            <div class="card-body">
                <h6 class="text-muted mb-2">Warnings</h6>
                <h3 class="mb-0 text-warning">
                    {{ alerts | selectattr('severity', 'equalto', 'warning') | list | length }}
                </h3>
            </div>
        </div>
    </div>
    <div class="col-md-3 mb-3">
        <div class="card">
            <div class="card-body">
                <h6 class="text-muted mb-2">Unacknowledged</h6>
                <h3 class="mb-0 text-warning">
                    {{ alerts | rejectattr('acknowledged') | list | length }}
                </h3>
            </div>
        </div>
    </div>
</div>

<!-- Filters -->
<div class="row mb-4">
    <div class="col-12">
        <div class="card">
            <div class="card-body">
                <form method="get" action="{{ url_for('main.alerts') }}" class="row g-3">
                    <div class="col-md-3">
                        <label for="severity-filter" class="form-label">Severity</label>
                        <select class="form-select" id="severity-filter" name="severity">
                            <option value="">All Severities</option>
                            <option value="critical" {% if current_severity == 'critical' %}selected{% endif %}>Critical</option>
                            <option value="warning" {% if current_severity == 'warning' %}selected{% endif %}>Warning</option>
                            <option value="info" {% if current_severity == 'info' %}selected{% endif %}>Info</option>
                        </select>
                    </div>
                    <div class="col-md-3">
                        <label for="type-filter" class="form-label">Alert Type</label>
                        <select class="form-select" id="type-filter" name="alert_type">
                            <option value="">All Types</option>
                            {% for at in alert_types %}
                            <option value="{{ at }}" {% if current_alert_type == at %}selected{% endif %}>
                                {{ at.replace('_', ' ').title() }}
                            </option>
                            {% endfor %}
                        </select>
                    </div>
                    <div class="col-md-3">
                        <label for="ack-filter" class="form-label">Acknowledgment</label>
                        <select class="form-select" id="ack-filter" name="acknowledged">
                            <option value="">All</option>
                            <option value="false" {% if current_acknowledged == 'false' %}selected{% endif %}>Unacknowledged</option>
                            <option value="true" {% if current_acknowledged == 'true' %}selected{% endif %}>Acknowledged</option>
                        </select>
                    </div>
                    <div class="col-md-3 d-flex align-items-end">
                        <button type="submit" class="btn btn-primary w-100">
                            <i class="bi bi-funnel"></i> Apply Filters
                        </button>
                    </div>
                </form>
            </div>
        </div>
    </div>
</div>

<!-- Alerts Table -->
<div class="row">
    <div class="col-12">
        <div class="card">
            <div class="card-header">
                <h5 class="mb-0">Alerts</h5>
            </div>
            <div class="card-body">
                {% if alerts %}
                <div class="table-responsive">
                    <table class="table table-hover">
                        <thead>
                            <tr>
                                <th style="width: 100px;">Severity</th>
                                <th>Type</th>
                                <th>Message</th>
                                <th style="width: 120px;">Target</th>
                                <th style="width: 150px;">Scan</th>
                                <th style="width: 150px;">Created</th>
                                <th style="width: 100px;">Status</th>
                                <th style="width: 100px;">Actions</th>
                            </tr>
                        </thead>
                        <tbody>
                            {% for alert in alerts %}
                            <tr>
                                <td>
                                    {% if alert.severity == 'critical' %}
                                    <span class="badge bg-danger">Critical</span>
                                    {% elif alert.severity == 'warning' %}
                                    <span class="badge bg-warning">Warning</span>
                                    {% else %}
                                    <span class="badge bg-info">Info</span>
                                    {% endif %}
                                </td>
                                <td>
                                    <span class="text-muted">{{ alert.alert_type.replace('_', ' ').title() }}</span>
                                </td>
                                <td>
                                    {{ alert.message }}
                                </td>
                                <td>
                                    {% if alert.ip_address %}
                                    <small class="text-muted">
                                        {{ alert.ip_address }}{% if alert.port %}:{{ alert.port }}{% endif %}
                                    </small>
                                    {% else %}
                                    <small class="text-muted">-</small>
                                    {% endif %}
                                </td>
                                <td>
                                    <a href="{{ url_for('main.scan_detail', scan_id=alert.scan_id) }}" class="text-decoration-none">
                                        Scan #{{ alert.scan_id }}
                                    </a>
                                </td>
                                <td>
                                    <small class="text-muted">{{ alert.created_at.strftime('%Y-%m-%d %H:%M') }}</small>
                                </td>
                                <td>
                                    {% if alert.acknowledged %}
                                    <span class="badge bg-success">
                                        <i class="bi bi-check-circle"></i> Ack'd
                                    </span>
                                    {% else %}
                                    <span class="badge bg-secondary">New</span>
                                    {% endif %}
                                    {% if alert.email_sent %}
                                    <i class="bi bi-envelope-fill text-muted" title="Email sent"></i>
                                    {% endif %}
                                    {% if alert.webhook_sent %}
                                    <i class="bi bi-send-fill text-muted" title="Webhook sent"></i>
                                    {% endif %}
                                </td>
                                <td>
                                    {% if not alert.acknowledged %}
                                    <button class="btn btn-sm btn-outline-success" onclick="acknowledgeAlert({{ alert.id }})">
                                        <i class="bi bi-check"></i> Ack
                                    </button>
                                    {% else %}
                                    <small class="text-muted" title="Acknowledged by {{ alert.acknowledged_by }} at {{ alert.acknowledged_at.strftime('%Y-%m-%d %H:%M') }}">
                                        By: {{ alert.acknowledged_by }}
                                    </small>
                                    {% endif %}
                                </td>
                            </tr>
                            {% endfor %}
                        </tbody>
                    </table>
                </div>

                <!-- Pagination -->
                {% if pagination.pages > 1 %}
                <nav aria-label="Alert pagination" class="mt-4">
                    <ul class="pagination justify-content-center">
                        <li class="page-item {% if not pagination.has_prev %}disabled{% endif %}">
                            <a class="page-link" href="{{ url_for('main.alerts', page=pagination.prev_num, severity=current_severity, alert_type=current_alert_type, acknowledged=current_acknowledged) }}">
                                Previous
                            </a>
                        </li>

                        {% for page_num in pagination.iter_pages(left_edge=1, left_current=1, right_current=2, right_edge=1) %}
                        {% if page_num %}
                        <li class="page-item {% if page_num == pagination.page %}active{% endif %}">
                            <a class="page-link" href="{{ url_for('main.alerts', page=page_num, severity=current_severity, alert_type=current_alert_type, acknowledged=current_acknowledged) }}">
                                {{ page_num }}
                            </a>
                        </li>
                        {% else %}
                        <li class="page-item disabled">
                            <span class="page-link">...</span>
                        </li>
                        {% endif %}
                        {% endfor %}

                        <li class="page-item {% if not pagination.has_next %}disabled{% endif %}">
                            <a class="page-link" href="{{ url_for('main.alerts', page=pagination.next_num, severity=current_severity, alert_type=current_alert_type, acknowledged=current_acknowledged) }}">
                                Next
                            </a>
                        </li>
                    </ul>
                </nav>
                {% endif %}
                {% else %}
                <div class="text-center py-5 text-muted">
                    <i class="bi bi-bell-slash" style="font-size: 3rem;"></i>
                    <h5 class="mt-3">No alerts found</h5>
                    <p>Alerts will appear here when scan results trigger alert rules.</p>
                    <a href="{{ url_for('main.alert_rules') }}" class="btn btn-primary mt-3">
                        <i class="bi bi-plus-circle"></i> Configure Alert Rules
                    </a>
                </div>
                {% endif %}
            </div>
        </div>
    </div>
</div>

<script>
function acknowledgeAlert(alertId) {
    if (!confirm('Acknowledge this alert?')) {
        return;
    }

    fetch(`/api/alerts/${alertId}/acknowledge`, {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'X-API-Key': localStorage.getItem('api_key') || ''
        },
        body: JSON.stringify({
            acknowledged_by: 'web_user'
        })
    })
    .then(response => response.json())
    .then(data => {
        if (data.status === 'success') {
            location.reload();
        } else {
            alert('Failed to acknowledge alert: ' + (data.message || 'Unknown error'));
        }
    })
    .catch(error => {
        console.error('Error:', error);
        alert('Failed to acknowledge alert');
    });
}

function acknowledgeAllAlerts() {
    if (!confirm('Acknowledge all unacknowledged alerts?')) {
        return;
    }

    fetch('/api/alerts/acknowledge-all', {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'X-API-Key': localStorage.getItem('api_key') || ''
        },
        body: JSON.stringify({
            acknowledged_by: 'web_user'
        })
    })
    .then(response => response.json())
    .then(data => {
        if (data.status === 'success') {
            location.reload();
        } else {
            alert('Failed to acknowledge alerts: ' + (data.message || 'Unknown error'));
        }
    })
    .catch(error => {
        console.error('Error:', error);
        alert('Failed to acknowledge alerts');
    });
}
</script>
{% endblock %}
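Acknowledgment goes through POST /api/alerts/<id>/acknowledge and POST /api/alerts/acknowledge-all, with an acknowledged_by field in the JSON body. A hedged sketch of bulk acknowledgment from a script (base URL and key are placeholders; the endpoints and body mirror the JavaScript above):

import requests

BASE_URL = 'http://localhost:5000'  # placeholder
API_KEY = 'your-api-key'            # placeholder

# Mirrors acknowledgeAllAlerts() above, but records a script identity
resp = requests.post(
    f'{BASE_URL}/api/alerts/acknowledge-all',
    json={'acknowledged_by': 'cleanup_script'},
    headers={'X-API-Key': API_KEY},
    timeout=10,
)
data = resp.json()
print(data.get('status'), data.get('message', ''))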
@@ -3,7 +3,7 @@
 <head>
     <meta charset="UTF-8">
     <meta name="viewport" content="width=device-width, initial-scale=1.0">
-    <title>{% block title %}SneakyScanner{% endblock %}</title>
+    <title>{% block title %}{{ app_name }}{% endblock %}</title>
 
     <!-- Bootstrap 5 CSS -->
     <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
@@ -34,7 +34,7 @@
     <nav class="navbar navbar-expand-lg navbar-custom">
         <div class="container-fluid">
             <a class="navbar-brand" href="{{ url_for('main.dashboard') }}">
-                SneakyScanner
+                {{ app_name }}
             </a>
             <button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navbarNav">
                 <span class="navbar-toggler-icon"></span>
@@ -45,6 +45,16 @@
                     <a class="nav-link {% if request.endpoint == 'main.dashboard' %}active{% endif %}"
                        href="{{ url_for('main.dashboard') }}">Dashboard</a>
                 </li>
+                <li class="nav-item dropdown">
+                    <a class="nav-link dropdown-toggle {% if request.endpoint and ('config' in request.endpoint or request.endpoint == 'main.sites') %}active{% endif %}"
+                       href="#" id="configsDropdown" role="button" data-bs-toggle="dropdown">
+                        Configs
+                    </a>
+                    <ul class="dropdown-menu" aria-labelledby="configsDropdown">
+                        <li><a class="dropdown-item" href="{{ url_for('main.configs') }}">Scan Configs</a></li>
+                        <li><a class="dropdown-item" href="{{ url_for('main.sites') }}">Sites</a></li>
+                    </ul>
+                </li>
                 <li class="nav-item">
                     <a class="nav-link {% if request.endpoint == 'main.scans' %}active{% endif %}"
                        href="{{ url_for('main.scans') }}">Scans</a>
@@ -53,12 +63,33 @@
                     <a class="nav-link {% if request.endpoint and 'schedule' in request.endpoint %}active{% endif %}"
                        href="{{ url_for('main.schedules') }}">Schedules</a>
                 </li>
-                <li class="nav-item">
-                    <a class="nav-link {% if request.endpoint and 'config' in request.endpoint %}active{% endif %}"
-                       href="{{ url_for('main.configs') }}">Configs</a>
+                <li class="nav-item dropdown">
+                    <a class="nav-link dropdown-toggle {% if request.endpoint and ('alert' in request.endpoint or 'webhook' in request.endpoint) %}active{% endif %}"
+                       href="#" id="alertsDropdown" role="button" data-bs-toggle="dropdown">
+                        Alerts
+                    </a>
+                    <ul class="dropdown-menu" aria-labelledby="alertsDropdown">
+                        <li><a class="dropdown-item" href="{{ url_for('main.alerts') }}">Alert History</a></li>
+                        <li><a class="dropdown-item" href="{{ url_for('main.alert_rules') }}">Alert Rules</a></li>
+                        <li><hr class="dropdown-divider"></li>
+                        <li><a class="dropdown-item" href="{{ url_for('webhooks.list_webhooks') }}">Webhooks</a></li>
+                    </ul>
                 </li>
             </ul>
+            <form class="d-flex me-3" action="{{ url_for('main.search_ip') }}" method="GET">
+                <input class="form-control form-control-sm me-2" type="search" name="ip"
+                       placeholder="Search IP..." aria-label="Search IP" style="width: 150px;">
+                <button class="btn btn-outline-primary btn-sm" type="submit">
+                    <i class="bi bi-search"></i>
+                </button>
+            </form>
             <ul class="navbar-nav">
+                <li class="nav-item">
+                    <a class="nav-link {% if request.endpoint == 'main.help' %}active{% endif %}"
+                       href="{{ url_for('main.help') }}">
+                        <i class="bi bi-question-circle"></i> Help
+                    </a>
+                </li>
                 <li class="nav-item">
                     <a class="nav-link" href="{{ url_for('auth.logout') }}">Logout</a>
                 </li>
@@ -85,10 +116,13 @@
 
     <div class="footer">
         <div class="container-fluid">
-            SneakyScanner v1.0 - Phase 3 In Progress
+            <a href="{{ repo_url }}" target="_blank">{{ app_name }}</a> - v{{ app_version }}
         </div>
     </div>
 
+    <!-- Global notification container - always above modals -->
+    <div id="notification-container" style="position: fixed; top: 20px; right: 20px; z-index: 9999; min-width: 300px;"></div>
+
     <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
     {% block scripts %}{% endblock %}
 </body>
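The {{ app_name }}, {{ app_version }}, and {{ repo_url }} variables used above have to be available in every rendered template. In Flask the usual mechanism for that is a context processor; a sketch under that assumption (the actual wiring and values live elsewhere in this changeset):

# Sketch only: assumes a Flask app object; names and values are placeholders.
from flask import Flask

app = Flask(__name__)

@app.context_processor
def inject_branding():
    """Make branding variables available to every rendered template."""
    return {
        'app_name': 'SneakyScanner',
        'app_version': '1.0',
        'repo_url': 'https://example.com/sneakyscanner',  # placeholder URL
    }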
@@ -1,263 +0,0 @@
{% extends "base.html" %}

{% block title %}Edit Config - SneakyScanner{% endblock %}

{% block extra_styles %}
<!-- CodeMirror CSS -->
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.65.2/codemirror.min.css">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.65.2/theme/dracula.min.css">
<style>
    .config-editor-container {
        background: #1e293b;
        border-radius: 8px;
        padding: 1.5rem;
        margin-top: 2rem;
    }

    .editor-header {
        display: flex;
        justify-content: space-between;
        align-items: center;
        margin-bottom: 1rem;
    }

    .CodeMirror {
        height: 600px;
        border: 1px solid #334155;
        border-radius: 4px;
        font-size: 14px;
        background: #0f172a;
    }

    .editor-actions {
        margin-top: 1.5rem;
        display: flex;
        gap: 1rem;
    }

    .validation-feedback {
        margin-top: 1rem;
        padding: 1rem;
        border-radius: 4px;
        display: none;
    }

    .validation-feedback.success {
        background: #065f46;
        border: 1px solid #10b981;
        color: #d1fae5;
    }

    .validation-feedback.error {
        background: #7f1d1d;
        border: 1px solid #ef4444;
        color: #fee2e2;
    }

    .back-link {
        color: #94a3b8;
        text-decoration: none;
        display: inline-flex;
        align-items: center;
        gap: 0.5rem;
        margin-bottom: 1rem;
    }

    .back-link:hover {
        color: #cbd5e1;
    }
</style>
{% endblock %}

{% block content %}
<div class="container-lg mt-4">
    <a href="{{ url_for('main.configs') }}" class="back-link">
        <i class="bi bi-arrow-left"></i> Back to Configs
    </a>

    <h2>Edit Configuration</h2>
    <p class="text-muted">Edit the YAML configuration for <strong>{{ filename }}</strong></p>

    <div class="config-editor-container">
        <div class="editor-header">
            <h5 class="mb-0">
                <i class="bi bi-file-earmark-code"></i> YAML Editor
            </h5>
            <button type="button" class="btn btn-sm btn-outline-primary" onclick="validateConfig()">
                <i class="bi bi-check-circle"></i> Validate
            </button>
        </div>

        <textarea id="yaml-editor">{{ content }}</textarea>

        <div class="validation-feedback" id="validation-feedback"></div>

        <div class="editor-actions">
            <button type="button" class="btn btn-primary" onclick="saveConfig()">
                <i class="bi bi-save"></i> Save Changes
            </button>
            <button type="button" class="btn btn-secondary" onclick="resetEditor()">
                <i class="bi bi-arrow-counterclockwise"></i> Reset
            </button>
            <a href="{{ url_for('main.configs') }}" class="btn btn-outline-secondary">
                <i class="bi bi-x-circle"></i> Cancel
            </a>
        </div>
    </div>
</div>

<!-- Success Modal -->
<div class="modal fade" id="successModal" tabindex="-1">
    <div class="modal-dialog">
        <div class="modal-content">
            <div class="modal-header bg-success text-white">
                <h5 class="modal-title">
                    <i class="bi bi-check-circle-fill"></i> Success
                </h5>
                <button type="button" class="btn-close btn-close-white" data-bs-dismiss="modal"></button>
            </div>
            <div class="modal-body">
                Configuration updated successfully!
            </div>
            <div class="modal-footer">
                <a href="{{ url_for('main.configs') }}" class="btn btn-success">
                    Back to Configs
                </a>
                <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">
                    Continue Editing
                </button>
            </div>
        </div>
    </div>
</div>
{% endblock %}

{% block scripts %}
<!-- CodeMirror JS -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.65.2/codemirror.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.65.2/mode/yaml/yaml.min.js"></script>

<script>
    // Initialize CodeMirror editor
    const editor = CodeMirror.fromTextArea(document.getElementById('yaml-editor'), {
        mode: 'yaml',
        theme: 'dracula',
        lineNumbers: true,
        lineWrapping: true,
        indentUnit: 2,
        tabSize: 2,
        indentWithTabs: false,
        extraKeys: {
            "Tab": function(cm) {
                cm.replaceSelection("  ", "end");
            }
        }
    });

    // Store original content for reset
    const originalContent = editor.getValue();

    // Validation function
    async function validateConfig() {
        const feedback = document.getElementById('validation-feedback');
        const content = editor.getValue();

        try {
            // Basic YAML syntax check (client-side)
            // Just check for common YAML issues
            if (content.trim() === '') {
                showFeedback('error', 'Configuration cannot be empty');
                return false;
            }

            // Check for basic structure
            if (!content.includes('title:')) {
                showFeedback('error', 'Missing required field: title');
                return false;
            }

            if (!content.includes('sites:')) {
                showFeedback('error', 'Missing required field: sites');
                return false;
            }

            showFeedback('success', 'Configuration appears valid. Click "Save Changes" to save.');
            return true;
        } catch (error) {
            showFeedback('error', 'Validation error: ' + error.message);
            return false;
        }
    }

    // Save configuration
    async function saveConfig() {
        const content = editor.getValue();
        const filename = '{{ filename }}';

        // Show loading state
        const saveBtn = event.target;
        const originalText = saveBtn.innerHTML;
        saveBtn.disabled = true;
        saveBtn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Saving...';

        try {
            const response = await fetch(`/api/configs/${filename}`, {
                method: 'PUT',
                headers: {
                    'Content-Type': 'application/json',
                },
                body: JSON.stringify({ content: content })
            });

            const data = await response.json();

            if (response.ok) {
                // Show success modal
                const modal = new bootstrap.Modal(document.getElementById('successModal'));
                modal.show();
            } else {
                // Show error feedback
                showFeedback('error', data.message || 'Failed to save configuration');
            }
        } catch (error) {
            showFeedback('error', 'Network error: ' + error.message);
        } finally {
            // Restore button state
            saveBtn.disabled = false;
            saveBtn.innerHTML = originalText;
        }
    }

    // Reset editor to original content
    function resetEditor() {
        if (confirm('Are you sure you want to reset all changes?')) {
            editor.setValue(originalContent);
            hideFeedback();
        }
    }

    // Show validation feedback
    function showFeedback(type, message) {
        const feedback = document.getElementById('validation-feedback');
        feedback.className = `validation-feedback ${type}`;
        feedback.innerHTML = `
            <i class="bi bi-${type === 'success' ? 'check-circle-fill' : 'exclamation-triangle-fill'}"></i>
            ${message}
        `;
        feedback.style.display = 'block';
    }

    // Hide validation feedback
    function hideFeedback() {
        const feedback = document.getElementById('validation-feedback');
        feedback.style.display = 'none';
    }

    // Auto-validate on content change (debounced)
    let validationTimeout;
    editor.on('change', function() {
        clearTimeout(validationTimeout);
        hideFeedback();
    });
</script>
{% endblock %}
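The client-side validation in the deleted editor above only checks that the text is non-empty and mentions title: and sites:; real YAML errors would only surface when the PUT /api/configs/<filename> endpoint parses the document. A server-side sketch of the equivalent check with PyYAML (an assumption; the endpoint's actual implementation is not part of this excerpt):

import yaml

def validate_config_text(text: str) -> tuple[bool, str]:
    """Parse config YAML and check the two fields the editor required."""
    try:
        doc = yaml.safe_load(text)
    except yaml.YAMLError as e:
        return False, f'Invalid YAML: {e}'
    if not isinstance(doc, dict):
        return False, 'Configuration must be a YAML mapping'
    for field in ('title', 'sites'):
        if field not in doc:
            return False, f'Missing required field: {field}'
    return True, 'Configuration appears valid'

# Example:
ok, message = validate_config_text('title: Demo\nsites: []\n')
print(ok, message)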
@@ -1,415 +0,0 @@
{% extends "base.html" %}

{% block title %}Create Configuration - SneakyScanner{% endblock %}

{% block extra_styles %}
<link rel="stylesheet" href="{{ url_for('static', filename='css/config-manager.css') }}">
<style>
    .file-info {
        background-color: #1e293b;
        border: 1px solid #334155;
        padding: 10px 15px;
        border-radius: 5px;
        margin-top: 15px;
        display: none;
    }

    .file-info-name {
        color: #60a5fa;
        font-weight: bold;
    }

    .file-info-size {
        color: #94a3b8;
        font-size: 0.9em;
    }
</style>
{% endblock %}

{% block content %}
<div class="row mt-4">
    <div class="col-12">
        <div class="d-flex justify-content-between align-items-center mb-4">
            <h1 style="color: #60a5fa;">Create New Configuration</h1>
            <a href="{{ url_for('main.configs') }}" class="btn btn-secondary">
                <i class="bi bi-arrow-left"></i> Back to Configs
            </a>
        </div>
    </div>
</div>

<!-- Upload Tabs -->
<div class="row">
    <div class="col-12">
        <ul class="nav nav-tabs mb-4" id="uploadTabs" role="tablist">
            <li class="nav-item" role="presentation">
                <button class="nav-link active" id="cidr-tab" data-bs-toggle="tab" data-bs-target="#cidr"
                        type="button" role="tab" style="color: #60a5fa;">
                    <i class="bi bi-diagram-3"></i> Create from CIDR
                </button>
            </li>
            <li class="nav-item" role="presentation">
                <button class="nav-link" id="yaml-tab" data-bs-toggle="tab" data-bs-target="#yaml"
                        type="button" role="tab" style="color: #60a5fa;">
                    <i class="bi bi-filetype-yml"></i> Upload YAML
                </button>
            </li>
        </ul>

        <div class="tab-content" id="uploadTabsContent">
            <!-- CIDR Form Tab -->
            <div class="tab-pane fade show active" id="cidr" role="tabpanel">
                <div class="row">
                    <div class="col-lg-8 offset-lg-2">
                        <div class="card">
                            <div class="card-header">
                                <h5 class="mb-0" style="color: #60a5fa;">
                                    <i class="bi bi-diagram-3"></i> Create Configuration from CIDR Range
                                </h5>
                            </div>
                            <div class="card-body">
                                <p class="text-muted">
                                    <i class="bi bi-info-circle"></i>
                                    Specify a CIDR range to automatically generate a configuration for all IPs in that range.
                                    You can edit the configuration afterwards to add expected ports and services.
                                </p>

                                <form id="cidr-form">
                                    <div class="mb-3">
                                        <label for="config-title" class="form-label" style="color: #94a3b8;">
                                            Config Title <span class="text-danger">*</span>
                                        </label>
                                        <input type="text" class="form-control" id="config-title"
                                               placeholder="e.g., Production Infrastructure Scan" required>
                                        <div class="form-text">A descriptive title for your scan configuration</div>
                                    </div>

                                    <div class="mb-3">
                                        <label for="cidr-range" class="form-label" style="color: #94a3b8;">
                                            CIDR Range <span class="text-danger">*</span>
                                        </label>
                                        <input type="text" class="form-control" id="cidr-range"
                                               placeholder="e.g., 10.0.0.0/24 or 192.168.1.0/28" required>
                                        <div class="form-text">
                                            Enter a CIDR range (e.g., 10.0.0.0/24 for 254 hosts).
                                            Maximum 10,000 addresses per range.
                                        </div>
                                    </div>

                                    <div class="mb-3">
                                        <label for="site-name" class="form-label" style="color: #94a3b8;">
                                            Site Name (optional)
                                        </label>
                                        <input type="text" class="form-control" id="site-name"
                                               placeholder="e.g., Production Servers">
                                        <div class="form-text">
                                            Logical grouping name for these IPs (default: "Site 1")
                                        </div>
                                    </div>

                                    <div class="mb-3">
                                        <div class="form-check">
                                            <input class="form-check-input" type="checkbox" id="ping-default">
                                            <label class="form-check-label" for="ping-default" style="color: #94a3b8;">
                                                Expect ping response by default
                                            </label>
                                        </div>
                                        <div class="form-text">
                                            Sets the default expectation for ICMP ping responses from these IPs
                                        </div>
                                    </div>

                                    <div id="cidr-errors" class="alert alert-danger" style="display:none;">
                                        <strong>Error:</strong> <span id="cidr-error-message"></span>
                                    </div>

                                    <div class="d-grid gap-2">
                                        <button type="submit" class="btn btn-primary btn-lg">
                                            <i class="bi bi-plus-circle"></i> Create Configuration
                                        </button>
                                    </div>
                                </form>

                                <div id="cidr-success" class="alert alert-success mt-3" style="display:none;">
                                    <i class="bi bi-check-circle-fill"></i>
                                    <strong>Success!</strong> Configuration created: <span id="cidr-created-filename"></span>
                                    <div class="mt-2">
                                        <a href="#" id="edit-new-config-link" class="btn btn-sm btn-outline-success">
                                            <i class="bi bi-pencil"></i> Edit Now
                                        </a>
                                    </div>
                                </div>
                            </div>
                        </div>
                    </div>
                </div>
            </div>

            <!-- YAML Upload Tab -->
            <div class="tab-pane fade" id="yaml" role="tabpanel">
                <div class="row">
                    <div class="col-lg-8 offset-lg-2">
                        <div class="card">
                            <div class="card-header">
                                <h5 class="mb-0" style="color: #60a5fa;">
                                    <i class="bi bi-cloud-upload"></i> Upload YAML Configuration
                                </h5>
                            </div>
                            <div class="card-body">
                                <p class="text-muted">
                                    <i class="bi bi-info-circle"></i>
                                    For advanced users: upload a YAML config file directly.
                                </p>

                                <div id="yaml-dropzone" class="dropzone">
                                    <i class="bi bi-cloud-upload"></i>
                                    <p>Drag & drop YAML file here or click to browse</p>
                                    <input type="file" id="yaml-file-input" accept=".yaml,.yml" hidden>
                                </div>

                                <div id="yaml-file-info" class="file-info">
                                    <div class="file-info-name" id="yaml-filename"></div>
                                    <div class="file-info-size" id="yaml-filesize"></div>
                                </div>

                                <div class="mt-3">
                                    <label for="yaml-custom-filename" class="form-label" style="color: #94a3b8;">
                                        Custom Filename (optional):
                                    </label>
                                    <input type="text" id="yaml-custom-filename" class="form-control"
                                           placeholder="Leave empty to use uploaded filename">
                                </div>

                                <button id="upload-yaml-btn" class="btn btn-primary mt-3" disabled>
                                    <i class="bi bi-upload"></i> Upload YAML
                                </button>

                                <div id="yaml-errors" class="alert alert-danger mt-3" style="display:none;">
                                    <strong>Error:</strong> <span id="yaml-error-message"></span>
                                </div>
                            </div>
                        </div>
                    </div>
                </div>
            </div>
        </div>
    </div>
</div>

<!-- Success Modal -->
<div class="modal fade" id="successModal" tabindex="-1">
    <div class="modal-dialog">
        <div class="modal-content" style="background-color: #1e293b; border: 1px solid #334155;">
            <div class="modal-header" style="border-bottom: 1px solid #334155;">
                <h5 class="modal-title" style="color: #10b981;">
                    <i class="bi bi-check-circle"></i> Success
                </h5>
                <button type="button" class="btn-close btn-close-white" data-bs-dismiss="modal"></button>
            </div>
            <div class="modal-body">
                <p style="color: #e2e8f0;">Configuration saved successfully!</p>
                <p style="color: #60a5fa; font-weight: bold;" id="success-filename"></p>
            </div>
            <div class="modal-footer" style="border-top: 1px solid #334155;">
                <a href="{{ url_for('main.configs') }}" class="btn btn-primary">
                    <i class="bi bi-list"></i> View All Configs
                </a>
                <button type="button" class="btn btn-success" onclick="location.reload()">
                    <i class="bi bi-plus-circle"></i> Create Another
                </button>
            </div>
        </div>
    </div>
</div>

{% endblock %}

{% block scripts %}
<script>
    // Global variables
    let yamlFile = null;

    // ============== CIDR Form Submission ==============

    document.getElementById('cidr-form').addEventListener('submit', async function(e) {
        e.preventDefault();

        const title = document.getElementById('config-title').value.trim();
        const cidr = document.getElementById('cidr-range').value.trim();
        const siteName = document.getElementById('site-name').value.trim();
        const pingDefault = document.getElementById('ping-default').checked;

        // Validate inputs
        if (!title) {
            showError('cidr', 'Config title is required');
            return;
        }

        if (!cidr) {
            showError('cidr', 'CIDR range is required');
            return;
        }

        // Show loading state
        const submitBtn = e.target.querySelector('button[type="submit"]');
        const originalText = submitBtn.innerHTML;
        submitBtn.disabled = true;
        submitBtn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Creating...';

        try {
            const response = await fetch('/api/configs/create-from-cidr', {
                method: 'POST',
                headers: {
                    'Content-Type': 'application/json',
                },
                body: JSON.stringify({
                    title: title,
                    cidr: cidr,
                    site_name: siteName || null,
                    ping_default: pingDefault
                })
            });

            if (!response.ok) {
                const error = await response.json();
                throw new Error(error.message || `HTTP ${response.status}`);
            }

            const data = await response.json();

            // Hide error, show success
            document.getElementById('cidr-errors').style.display = 'none';
            document.getElementById('cidr-created-filename').textContent = data.filename;

            // Set edit link
            document.getElementById('edit-new-config-link').href = `/configs/edit/${data.filename}`;

            document.getElementById('cidr-success').style.display = 'block';

            // Reset form
            e.target.reset();

            // Show success modal
            showSuccess(data.filename);

        } catch (error) {
            console.error('Error creating config from CIDR:', error);
            showError('cidr', error.message);
        } finally {
            // Restore button state
            submitBtn.disabled = false;
            submitBtn.innerHTML = originalText;
        }
    });

    // ============== YAML Upload ==============

    // Setup YAML dropzone
    const yamlDropzone = document.getElementById('yaml-dropzone');
    const yamlFileInput = document.getElementById('yaml-file-input');

    yamlDropzone.addEventListener('click', () => yamlFileInput.click());

    yamlDropzone.addEventListener('dragover', (e) => {
        e.preventDefault();
        yamlDropzone.classList.add('dragover');
    });

    yamlDropzone.addEventListener('dragleave', () => {
        yamlDropzone.classList.remove('dragover');
    });

    yamlDropzone.addEventListener('drop', (e) => {
        e.preventDefault();
        yamlDropzone.classList.remove('dragover');
        const file = e.dataTransfer.files[0];
        handleYAMLFile(file);
    });

    yamlFileInput.addEventListener('change', (e) => {
        const file = e.target.files[0];
        handleYAMLFile(file);
    });

    // Handle YAML file selection
    function handleYAMLFile(file) {
        if (!file) return;

        // Check file extension
        if (!file.name.endsWith('.yaml') && !file.name.endsWith('.yml')) {
            showError('yaml', 'Please select a YAML file (.yaml or .yml)');
            return;
        }

        yamlFile = file;

        // Show file info
        document.getElementById('yaml-filename').textContent = file.name;
        document.getElementById('yaml-filesize').textContent = formatFileSize(file.size);
        document.getElementById('yaml-file-info').style.display = 'block';

        // Enable upload button
        document.getElementById('upload-yaml-btn').disabled = false;
        document.getElementById('yaml-errors').style.display = 'none';
    }

    // Upload YAML file
    async function uploadYAMLFile() {
        if (!yamlFile) return;

        try {
            const formData = new FormData();
            formData.append('file', yamlFile);

            const customFilename = document.getElementById('yaml-custom-filename').value.trim();
            if (customFilename) {
                formData.append('filename', customFilename);
            }

            const response = await fetch('/api/configs/upload-yaml', {
                method: 'POST',
                body: formData
            });

            if (!response.ok) {
                const error = await response.json();
                throw new Error(error.message || `HTTP ${response.status}`);
            }

            const data = await response.json();
            showSuccess(data.filename);

        } catch (error) {
            console.error('Error uploading YAML:', error);
            showError('yaml', error.message);
        }
    }

    document.getElementById('upload-yaml-btn').addEventListener('click', uploadYAMLFile);

    // ============== Helper Functions ==============

    // Format file size
    function formatFileSize(bytes) {
        if (bytes === 0) return '0 Bytes';
        const k = 1024;
        const sizes = ['Bytes', 'KB', 'MB'];
        const i = Math.floor(Math.log(bytes) / Math.log(k));
        return Math.round((bytes / Math.pow(k, i)) * 100) / 100 + ' ' + sizes[i];
    }

    // Show error
    function showError(type, message) {
        const errorDiv = document.getElementById(`${type}-errors`);
        const errorMsg = document.getElementById(`${type}-error-message`);
        errorMsg.textContent = message;
        errorDiv.style.display = 'block';
    }

    // Show success
    function showSuccess(filename) {
        document.getElementById('success-filename').textContent = filename;
        new bootstrap.Modal(document.getElementById('successModal')).show();
    }
</script>
{% endblock %}
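The deleted CIDR form above promised one config entry per address in the range, capped at 10,000 addresses. A sketch of how that expansion might look server-side with the standard ipaddress module (an assumption; the endpoint's real implementation is not shown in this diff):

import ipaddress

MAX_ADDRESSES = 10_000  # cap stated in the form's help text

def expand_cidr(cidr: str) -> list[str]:
    """Return the usable host addresses in a CIDR range, enforcing the cap."""
    network = ipaddress.ip_network(cidr, strict=False)
    # num_addresses includes network/broadcast; hosts() excludes them
    if network.num_addresses > MAX_ADDRESSES:
        raise ValueError(
            f'{cidr} has {network.num_addresses} addresses '
            f'(maximum {MAX_ADDRESSES})'
        )
    return [str(ip) for ip in network.hosts()]

# Example: a /24 yields 254 usable hosts, matching the form's hint
print(len(expand_cidr('10.0.0.0/24')))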
@@ -1,20 +1,16 @@
 {% extends "base.html" %}
 
-{% block title %}Configuration Files - SneakyScanner{% endblock %}
-
-{% block extra_styles %}
-<link rel="stylesheet" href="{{ url_for('static', filename='css/config-manager.css') }}">
-{% endblock %}
+{% block title %}Scan Configurations - SneakyScanner{% endblock %}
 
 {% block content %}
 <div class="row mt-4">
     <div class="col-12">
         <div class="d-flex justify-content-between align-items-center mb-4">
-            <h1 style="color: #60a5fa;">Configuration Files</h1>
+            <h1>Scan Configurations</h1>
             <div>
-                <a href="{{ url_for('main.upload_config') }}" class="btn btn-primary">
+                <button class="btn btn-primary" data-bs-toggle="modal" data-bs-target="#createConfigModal">
                     <i class="bi bi-plus-circle"></i> Create New Config
-                </a>
+                </button>
             </div>
         </div>
     </div>
@@ -30,14 +26,14 @@
         </div>
         <div class="col-md-4">
             <div class="stat-card">
-                <div class="stat-value" id="configs-in-use">-</div>
-                <div class="stat-label">In Use by Schedules</div>
+                <div class="stat-value" id="total-sites-used">-</div>
+                <div class="stat-label">Total Sites Referenced</div>
             </div>
         </div>
         <div class="col-md-4">
             <div class="stat-card">
-                <div class="stat-value" id="total-size">-</div>
-                <div class="stat-label">Total Size</div>
+                <div class="stat-value" id="recent-updates">-</div>
+                <div class="stat-label">Updated This Week</div>
             </div>
         </div>
     </div>
@@ -48,7 +44,7 @@
         <div class="card">
             <div class="card-header">
                 <div class="d-flex justify-content-between align-items-center">
-                    <h5 class="mb-0" style="color: #60a5fa;">All Configurations</h5>
+                    <h5 class="mb-0">All Configurations</h5>
                     <input type="text" id="search-input" class="form-control" style="max-width: 300px;"
                            placeholder="Search configs...">
                 </div>
@@ -68,11 +64,10 @@
                 <table class="table table-hover">
                     <thead>
                         <tr>
-                            <th>Filename</th>
                             <th>Title</th>
-                            <th>Created</th>
-                            <th>Size</th>
-                            <th>Used By</th>
+                            <th>Description</th>
+                            <th>Sites</th>
+                            <th>Updated</th>
                             <th>Actions</th>
                         </tr>
                     </thead>
@@ -82,12 +77,12 @@
                 </table>
             </div>
             <div id="empty-state" style="display: none;" class="text-center py-5">
-                <i class="bi bi-file-earmark-text" style="font-size: 3rem; color: #64748b;"></i>
-                <h5 class="mt-3 text-muted">No configuration files</h5>
-                <p class="text-muted">Create your first config to define scan targets</p>
-                <a href="{{ url_for('main.upload_config') }}" class="btn btn-primary mt-2">
+                <i class="bi bi-gear" style="font-size: 3rem; color: #64748b;"></i>
+                <h5 class="mt-3 text-muted">No configurations defined</h5>
+                <p class="text-muted">Create your first scan configuration</p>
+                <button class="btn btn-primary mt-2" data-bs-toggle="modal" data-bs-target="#createConfigModal">
                     <i class="bi bi-plus-circle"></i> Create Config
-                </a>
+                </button>
             </div>
         </div>
     </div>
@@ -95,28 +90,99 @@
     </div>
 </div>
 
-<!-- Delete Confirmation Modal -->
-<div class="modal fade" id="deleteModal" tabindex="-1">
-    <div class="modal-dialog">
-        <div class="modal-content" style="background-color: #1e293b; border: 1px solid #334155;">
-            <div class="modal-header" style="border-bottom: 1px solid #334155;">
-                <h5 class="modal-title" style="color: #f87171;">
-                    <i class="bi bi-exclamation-triangle"></i> Confirm Deletion
+<!-- Create Config Modal -->
+<div class="modal fade" id="createConfigModal" tabindex="-1">
+    <div class="modal-dialog modal-lg">
+        <div class="modal-content">
+            <div class="modal-header">
+                <h5 class="modal-title">
+                    <i class="bi bi-plus-circle me-2"></i>Create New Configuration
                 </h5>
-                <button type="button" class="btn-close btn-close-white" data-bs-dismiss="modal"></button>
+                <button type="button" class="btn-close" data-bs-dismiss="modal"></button>
             </div>
             <div class="modal-body">
-                <p style="color: #e2e8f0;">Are you sure you want to delete the config file:</p>
-                <p style="color: #60a5fa; font-weight: bold;" id="delete-config-name"></p>
-                <p style="color: #fbbf24;" id="delete-warning-schedules" style="display: none;">
-                    <i class="bi bi-exclamation-circle"></i>
-                    This config is used by schedules and cannot be deleted.
-                </p>
+                <form id="create-config-form">
+                    <div class="mb-3">
+                        <label for="config-title" class="form-label">Title <span class="text-danger">*</span></label>
+                        <input type="text" class="form-control" id="config-title" required
+                               placeholder="e.g., Production Weekly Scan">
+                    </div>
+
+                    <div class="mb-3">
+                        <label for="config-description" class="form-label">Description</label>
+                        <textarea class="form-control" id="config-description" rows="3"
+                                  placeholder="Optional description of this configuration"></textarea>
+                    </div>
+
+                    <div class="mb-3">
+                        <label class="form-label">Sites <span class="text-danger">*</span></label>
+                        <div id="sites-loading-modal" class="text-center py-3">
+                            <div class="spinner-border spinner-border-sm text-primary" role="status">
+                                <span class="visually-hidden">Loading...</span>
+                            </div>
+                            <span class="ms-2 text-muted">Loading available sites...</span>
+                        </div>
+                        <div id="sites-list" style="display: none;">
+                            <!-- Populated by JavaScript -->
+                        </div>
+                        <small class="form-text text-muted">Select at least one site to include in this configuration</small>
+                    </div>
+
+                    <div class="alert alert-danger" id="create-config-error" style="display: none;">
+                        <span id="create-config-error-message"></span>
+                    </div>
+                </form>
             </div>
-            <div class="modal-footer" style="border-top: 1px solid #334155;">
+            <div class="modal-footer">
                 <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
-                <button type="button" class="btn btn-danger" id="confirm-delete-btn">
-                    <i class="bi bi-trash"></i> Delete
+                <button type="button" class="btn btn-primary" id="create-config-btn">
+                    <i class="bi bi-check-circle me-1"></i>Create Configuration
                 </button>
             </div>
         </div>
     </div>
 </div>
 
+<!-- Edit Config Modal -->
+<div class="modal fade" id="editConfigModal" tabindex="-1">
+    <div class="modal-dialog modal-lg">
+        <div class="modal-content">
+            <div class="modal-header">
+                <h5 class="modal-title">
+                    <i class="bi bi-pencil me-2"></i>Edit Configuration
+                </h5>
+                <button type="button" class="btn-close" data-bs-dismiss="modal"></button>
+            </div>
+            <div class="modal-body">
+                <form id="edit-config-form">
+                    <input type="hidden" id="edit-config-id">
+
+                    <div class="mb-3">
+                        <label for="edit-config-title" class="form-label">Title <span class="text-danger">*</span></label>
+                        <input type="text" class="form-control" id="edit-config-title" required>
+                    </div>
+
+                    <div class="mb-3">
+                        <label for="edit-config-description" class="form-label">Description</label>
+                        <textarea class="form-control" id="edit-config-description" rows="3"></textarea>
+                    </div>
+
+                    <div class="mb-3">
+                        <label class="form-label">Sites <span class="text-danger">*</span></label>
+                        <div id="edit-sites-list">
+                            <!-- Populated by JavaScript -->
</div>
|
||||
</div>
|
||||
|
||||
<div class="alert alert-danger" id="edit-config-error" style="display: none;">
|
||||
<span id="edit-config-error-message"></span>
|
||||
</div>
|
||||
</form>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
|
||||
<button type="button" class="btn btn-primary" id="edit-config-btn">
|
||||
<i class="bi bi-check-circle me-1"></i>Save Changes
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
@@ -124,25 +190,47 @@
|
||||
</div>
|
||||
|
||||
<!-- View Config Modal -->
|
||||
<div class="modal fade" id="viewModal" tabindex="-1">
|
||||
<div class="modal fade" id="viewConfigModal" tabindex="-1">
|
||||
<div class="modal-dialog modal-lg">
|
||||
<div class="modal-content" style="background-color: #1e293b; border: 1px solid #334155;">
|
||||
<div class="modal-header" style="border-bottom: 1px solid #334155;">
|
||||
<h5 class="modal-title" style="color: #60a5fa;">
|
||||
<i class="bi bi-file-earmark-code"></i> Config File Details
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">
|
||||
<i class="bi bi-eye me-2"></i>Configuration Details
|
||||
</h5>
|
||||
<button type="button" class="btn-close btn-close-white" data-bs-dismiss="modal"></button>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<h6 style="color: #94a3b8;">Filename: <span id="view-filename" style="color: #e2e8f0;"></span></h6>
|
||||
<h6 class="mt-3" style="color: #94a3b8;">Content:</h6>
|
||||
<pre style="background-color: #0f172a; border: 1px solid #334155; padding: 15px; border-radius: 5px; max-height: 400px; overflow-y: auto;"><code id="view-content" style="color: #e2e8f0;"></code></pre>
|
||||
<div id="view-config-content">
|
||||
<!-- Populated by JavaScript -->
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer" style="border-top: 1px solid #334155;">
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
|
||||
<a id="download-link" href="#" class="btn btn-primary">
|
||||
<i class="bi bi-download"></i> Download
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Delete Confirmation Modal -->
|
||||
<div class="modal fade" id="deleteConfigModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title text-danger">
|
||||
<i class="bi bi-trash me-2"></i>Delete Configuration
|
||||
</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p>Are you sure you want to delete configuration <strong id="delete-config-name"></strong>?</p>
|
||||
<p class="text-warning"><i class="bi bi-exclamation-triangle"></i> This action cannot be undone.</p>
|
||||
<input type="hidden" id="delete-config-id">
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
|
||||
<button type="button" class="btn btn-danger" id="confirm-delete-btn">
|
||||
<i class="bi bi-trash me-1"></i>Delete
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
@@ -152,47 +240,90 @@
|
||||
|
||||
{% block scripts %}
<script>
// Global variables
let configsData = [];
let selectedConfigForDeletion = null;
// Global state
let allConfigs = [];
let allSites = [];

// Format file size
function formatFileSize(bytes) {
if (bytes === 0) return '0 Bytes';
const k = 1024;
const sizes = ['Bytes', 'KB', 'MB'];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return Math.round((bytes / Math.pow(k, i)) * 100) / 100 + ' ' + sizes[i];
}
// Load data on page load
document.addEventListener('DOMContentLoaded', function() {
loadSites();
loadConfigs();
});

// Format date
function formatDate(timestamp) {
if (!timestamp) return 'Unknown';
const date = new Date(timestamp);
return date.toLocaleString();
}

// Load configs from API
async function loadConfigs() {
// Load all sites
async function loadSites() {
try {
document.getElementById('configs-loading').style.display = 'block';
document.getElementById('configs-error').style.display = 'none';
document.getElementById('configs-content').style.display = 'none';

const response = await fetch('/api/configs');
if (!response.ok) {
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
}
const response = await fetch('/api/sites?all=true');
if (!response.ok) throw new Error('Failed to load sites');

const data = await response.json();
configsData = data.configs || [];
allSites = data.sites || [];

renderSitesCheckboxes();
} catch (error) {
console.error('Error loading sites:', error);
document.getElementById('sites-loading-modal').innerHTML =
'<div class="alert alert-danger">Failed to load sites</div>';
}
}

// Render sites checkboxes
function renderSitesCheckboxes(selectedIds = [], isEditMode = false) {
const container = isEditMode ? document.getElementById('edit-sites-list') : document.getElementById('sites-list');

if (!container) return;

if (allSites.length === 0) {
const message = '<div class="alert alert-info">No sites available. <a href="/sites">Create a site first</a>.</div>';
container.innerHTML = message;
if (!isEditMode) {
document.getElementById('sites-loading-modal').style.display = 'none';
container.style.display = 'block';
}
return;
}

const prefix = isEditMode ? 'edit-site' : 'site';
const checkboxClass = isEditMode ? 'edit-site-checkbox' : 'site-checkbox';

let html = '<div style="max-height: 300px; overflow-y: auto;">';
allSites.forEach(site => {
const isChecked = selectedIds.includes(site.id);
html += `
<div class="form-check">
<input class="form-check-input ${checkboxClass}" type="checkbox" value="${site.id}"
id="${prefix}-${site.id}" ${isChecked ? 'checked' : ''}>
<label class="form-check-label" for="${prefix}-${site.id}">
${escapeHtml(site.name)}
<small class="text-muted">(${site.ip_count || 0} IP${site.ip_count !== 1 ? 's' : ''})</small>
</label>
</div>
`;
});
html += '</div>';

container.innerHTML = html;

if (!isEditMode) {
document.getElementById('sites-loading-modal').style.display = 'none';
container.style.display = 'block';
}
}

// Load all configs
async function loadConfigs() {
try {
const response = await fetch('/api/configs');
if (!response.ok) throw new Error('Failed to load configs');

const data = await response.json();
allConfigs = data.configs || [];

renderConfigs();
updateStats();
renderConfigs(configsData);

document.getElementById('configs-loading').style.display = 'none';
document.getElementById('configs-content').style.display = 'block';

} catch (error) {
console.error('Error loading configs:', error);
document.getElementById('configs-loading').style.display = 'none';
@@ -201,177 +332,249 @@ async function loadConfigs() {
}
}

// Update summary stats
function updateStats() {
const totalConfigs = configsData.length;
const configsInUse = configsData.filter(c => c.used_by_schedules && c.used_by_schedules.length > 0).length;
const totalSize = configsData.reduce((sum, c) => sum + (c.size_bytes || 0), 0);

document.getElementById('total-configs').textContent = totalConfigs;
document.getElementById('configs-in-use').textContent = configsInUse;
document.getElementById('total-size').textContent = formatFileSize(totalSize);
}

// Render configs table
function renderConfigs(configs) {
function renderConfigs(filter = '') {
const tbody = document.getElementById('configs-tbody');
const emptyState = document.getElementById('empty-state');

if (configs.length === 0) {
const filteredConfigs = filter
? allConfigs.filter(c =>
c.title.toLowerCase().includes(filter.toLowerCase()) ||
(c.description && c.description.toLowerCase().includes(filter.toLowerCase()))
)
: allConfigs;

if (filteredConfigs.length === 0) {
tbody.innerHTML = '';
emptyState.style.display = 'block';
return;
}

emptyState.style.display = 'none';

tbody.innerHTML = configs.map(config => {
const usedByBadge = config.used_by_schedules && config.used_by_schedules.length > 0
? `<span class="badge bg-info" title="${config.used_by_schedules.join(', ')}">${config.used_by_schedules.length} schedule(s)</span>`
: '<span class="badge bg-secondary">Not used</span>';

return `
<tr>
<td><code>${config.filename}</code></td>
<td>${config.title || config.filename}</td>
<td>${formatDate(config.created_at)}</td>
<td>${formatFileSize(config.size_bytes || 0)}</td>
<td>${usedByBadge}</td>
<td>
<div class="btn-group btn-group-sm" role="group">
<button class="btn btn-outline-primary" onclick="viewConfig('${config.filename}')" title="View">
<i class="bi bi-eye"></i>
</button>
<a href="/configs/edit/${config.filename}" class="btn btn-outline-info" title="Edit">
<i class="bi bi-pencil"></i>
</a>
<a href="/api/configs/${config.filename}/download" class="btn btn-outline-success" title="Download">
<i class="bi bi-download"></i>
</a>
<button class="btn btn-outline-danger" onclick="confirmDelete('${config.filename}', ${config.used_by_schedules.length > 0})" title="Delete">
<i class="bi bi-trash"></i>
</button>
</div>
</td>
</tr>
`;
}).join('');
tbody.innerHTML = filteredConfigs.map(config => `
<tr>
<td><strong>${escapeHtml(config.title)}</strong></td>
<td>${config.description ? escapeHtml(config.description) : '<span class="text-muted">-</span>'}</td>
<td>
<span class="badge bg-primary">${config.site_count} site${config.site_count !== 1 ? 's' : ''}</span>
</td>
<td>${formatDate(config.updated_at)}</td>
<td>
<button class="btn btn-sm btn-info" onclick="viewConfig(${config.id})" title="View">
<i class="bi bi-eye"></i>
</button>
<button class="btn btn-sm btn-warning" onclick="editConfig(${config.id})" title="Edit">
<i class="bi bi-pencil"></i>
</button>
<button class="btn btn-sm btn-danger" onclick="deleteConfig(${config.id}, '${escapeHtml(config.title).replace(/'/g, "\\'")}');" title="Delete">
<i class="bi bi-trash"></i>
</button>
</td>
</tr>
`).join('');
}

// View config details
async function viewConfig(filename) {
try {
const response = await fetch(`/api/configs/${filename}`);
if (!response.ok) {
throw new Error(`Failed to load config: ${response.statusText}`);
}
// Update stats
function updateStats() {
document.getElementById('total-configs').textContent = allConfigs.length;

const data = await response.json();
const uniqueSites = new Set();
allConfigs.forEach(c => c.sites.forEach(s => uniqueSites.add(s.id)));
document.getElementById('total-sites-used').textContent = uniqueSites.size;

document.getElementById('view-filename').textContent = data.filename;
document.getElementById('view-content').textContent = data.content;
document.getElementById('download-link').href = `/api/configs/${filename}/download`;

new bootstrap.Modal(document.getElementById('viewModal')).show();

} catch (error) {
console.error('Error viewing config:', error);
alert(`Error: ${error.message}`);
}
const oneWeekAgo = new Date();
oneWeekAgo.setDate(oneWeekAgo.getDate() - 7);
const recentUpdates = allConfigs.filter(c => new Date(c.updated_at) > oneWeekAgo).length;
document.getElementById('recent-updates').textContent = recentUpdates;
}

// Confirm delete
function confirmDelete(filename, isInUse) {
selectedConfigForDeletion = filename;
document.getElementById('delete-config-name').textContent = filename;

const warningDiv = document.getElementById('delete-warning-schedules');
const deleteBtn = document.getElementById('confirm-delete-btn');

if (isInUse) {
warningDiv.style.display = 'block';
deleteBtn.disabled = true;
deleteBtn.classList.add('disabled');
} else {
warningDiv.style.display = 'none';
deleteBtn.disabled = false;
deleteBtn.classList.remove('disabled');
}

new bootstrap.Modal(document.getElementById('deleteModal')).show();
}

// Delete config
async function deleteConfig() {
if (!selectedConfigForDeletion) return;

try {
const response = await fetch(`/api/configs/${selectedConfigForDeletion}`, {
method: 'DELETE'
});

if (!response.ok) {
const error = await response.json();
throw new Error(error.message || `HTTP ${response.status}`);
}

// Hide modal
bootstrap.Modal.getInstance(document.getElementById('deleteModal')).hide();

// Reload configs
await loadConfigs();

// Show success message
showAlert('success', `Config "${selectedConfigForDeletion}" deleted successfully`);

} catch (error) {
console.error('Error deleting config:', error);
showAlert('danger', `Error deleting config: ${error.message}`);
}
}

// Show alert
function showAlert(type, message) {
const alertHtml = `
<div class="alert alert-${type} alert-dismissible fade show mt-3" role="alert">
${message}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
</div>
`;

const container = document.querySelector('.container-fluid');
container.insertAdjacentHTML('afterbegin', alertHtml);

// Auto-dismiss after 5 seconds
setTimeout(() => {
const alert = container.querySelector('.alert');
if (alert) {
bootstrap.Alert.getInstance(alert)?.close();
}
}, 5000);
}

// Search filter
// Search functionality
document.getElementById('search-input').addEventListener('input', function(e) {
const searchTerm = e.target.value.toLowerCase();
renderConfigs(e.target.value);
});

if (!searchTerm) {
renderConfigs(configsData);
// Create config
document.getElementById('create-config-btn').addEventListener('click', async function() {
const title = document.getElementById('config-title').value.trim();
const description = document.getElementById('config-description').value.trim();
const siteCheckboxes = document.querySelectorAll('.site-checkbox:checked');
const siteIds = Array.from(siteCheckboxes).map(cb => parseInt(cb.value));

if (!title) {
showError('create-config-error', 'Title is required');
return;
}

const filtered = configsData.filter(config =>
config.filename.toLowerCase().includes(searchTerm) ||
(config.title && config.title.toLowerCase().includes(searchTerm))
);
if (siteIds.length === 0) {
showError('create-config-error', 'At least one site must be selected');
return;
}

renderConfigs(filtered);
try {
const response = await fetch('/api/configs', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ title, description: description || null, site_ids: siteIds })
});

if (!response.ok) {
const data = await response.json();
throw new Error(data.message || 'Failed to create config');
}

// Close modal and reload
bootstrap.Modal.getInstance(document.getElementById('createConfigModal')).hide();
document.getElementById('create-config-form').reset();
renderSitesCheckboxes(); // Reset checkboxes
await loadConfigs();
} catch (error) {
showError('create-config-error', error.message);
}
});

// Setup delete button
document.getElementById('confirm-delete-btn').addEventListener('click', deleteConfig);
// View config
async function viewConfig(id) {
try {
const response = await fetch(`/api/configs/${id}`);
if (!response.ok) throw new Error('Failed to load config');

// Load configs on page load
document.addEventListener('DOMContentLoaded', loadConfigs);
const config = await response.json();

let html = `
<div class="mb-3">
<strong>Title:</strong> ${escapeHtml(config.title)}
</div>
<div class="mb-3">
<strong>Description:</strong> ${config.description ? escapeHtml(config.description) : '<span class="text-muted">None</span>'}
</div>
<div class="mb-3">
<strong>Sites (${config.site_count}):</strong>
<ul class="mt-2">
${config.sites.map(site => `
<li>${escapeHtml(site.name)} <small class="text-muted">(${site.ip_count} IP${site.ip_count !== 1 ? 's' : ''})</small></li>
`).join('')}
</ul>
</div>
<div class="mb-3">
<strong>Created:</strong> ${formatDate(config.created_at)}
</div>
<div class="mb-3">
<strong>Last Updated:</strong> ${formatDate(config.updated_at)}
</div>
`;

document.getElementById('view-config-content').innerHTML = html;
new bootstrap.Modal(document.getElementById('viewConfigModal')).show();
} catch (error) {
alert('Error loading config: ' + error.message);
}
}

// Edit config
async function editConfig(id) {
try {
const response = await fetch(`/api/configs/${id}`);
if (!response.ok) throw new Error('Failed to load config');

const config = await response.json();

document.getElementById('edit-config-id').value = config.id;
document.getElementById('edit-config-title').value = config.title;
document.getElementById('edit-config-description').value = config.description || '';

const selectedIds = config.sites.map(s => s.id);
renderSitesCheckboxes(selectedIds, true); // true = isEditMode

new bootstrap.Modal(document.getElementById('editConfigModal')).show();
} catch (error) {
alert('Error loading config: ' + error.message);
}
}

// Save edited config
document.getElementById('edit-config-btn').addEventListener('click', async function() {
const id = document.getElementById('edit-config-id').value;
const title = document.getElementById('edit-config-title').value.trim();
const description = document.getElementById('edit-config-description').value.trim();
const siteCheckboxes = document.querySelectorAll('.edit-site-checkbox:checked');
const siteIds = Array.from(siteCheckboxes).map(cb => parseInt(cb.value));

if (!title) {
showError('edit-config-error', 'Title is required');
return;
}

if (siteIds.length === 0) {
showError('edit-config-error', 'At least one site must be selected');
return;
}

try {
const response = await fetch(`/api/configs/${id}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ title, description: description || null, site_ids: siteIds })
});

if (!response.ok) {
const data = await response.json();
throw new Error(data.message || 'Failed to update config');
}

// Close modal and reload
bootstrap.Modal.getInstance(document.getElementById('editConfigModal')).hide();
await loadConfigs();
} catch (error) {
showError('edit-config-error', error.message);
}
});

// Delete config
function deleteConfig(id, name) {
document.getElementById('delete-config-id').value = id;
document.getElementById('delete-config-name').textContent = name;
new bootstrap.Modal(document.getElementById('deleteConfigModal')).show();
}

// Confirm delete
document.getElementById('confirm-delete-btn').addEventListener('click', async function() {
const id = document.getElementById('delete-config-id').value;

try {
const response = await fetch(`/api/configs/${id}`, { method: 'DELETE' });

if (!response.ok) {
const data = await response.json();
throw new Error(data.message || 'Failed to delete config');
}

// Close modal and reload
bootstrap.Modal.getInstance(document.getElementById('deleteConfigModal')).hide();
await loadConfigs();
} catch (error) {
alert('Error deleting config: ' + error.message);
}
});

// Utility functions
function showError(elementId, message) {
const errorEl = document.getElementById(elementId);
const messageEl = document.getElementById(elementId + '-message');
messageEl.textContent = message;
errorEl.style.display = 'block';
}

function escapeHtml(text) {
if (!text) return '';
const div = document.createElement('div');
div.textContent = text;
return div.innerHTML;
}

function formatDate(dateStr) {
if (!dateStr) return '-';
const date = new Date(dateStr);
return date.toLocaleDateString() + ' ' + date.toLocaleTimeString();
}
</script>
{% endblock %}
@@ -5,7 +5,7 @@
{% block content %}
<div class="row mt-4">
<div class="col-12">
<h1 class="mb-4" style="color: #60a5fa;">Dashboard</h1>
<h1 class="mb-4">Dashboard</h1>
</div>
</div>

@@ -42,7 +42,7 @@
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0" style="color: #60a5fa;">Quick Actions</h5>
<h5 class="mb-0">Quick Actions</h5>
</div>
<div class="card-body">
<button class="btn btn-primary btn-lg" onclick="showTriggerScanModal()">
@@ -63,7 +63,7 @@
<div class="col-md-8">
<div class="card">
<div class="card-header">
<h5 class="mb-0" style="color: #60a5fa;">Scan Activity (Last 30 Days)</h5>
<h5 class="mb-0">Scan Activity (Last 30 Days)</h5>
</div>
<div class="card-body">
<div id="chart-loading" class="text-center py-4">
@@ -80,7 +80,7 @@
<div class="col-md-4">
<div class="card h-100">
<div class="card-header d-flex justify-content-between align-items-center">
<h5 class="mb-0" style="color: #60a5fa;">Upcoming Schedules</h5>
<h5 class="mb-0">Upcoming Schedules</h5>
<a href="{{ url_for('main.schedules') }}" class="btn btn-sm btn-secondary">Manage</a>
</div>
<div class="card-body">
@@ -105,7 +105,7 @@
<div class="col-12">
<div class="card">
<div class="card-header d-flex justify-content-between align-items-center">
<h5 class="mb-0" style="color: #60a5fa;">Recent Scans</h5>
<h5 class="mb-0">Recent Scans</h5>
<button class="btn btn-sm btn-secondary" onclick="refreshScans()">
<span id="refresh-text">Refresh</span>
<span id="refresh-spinner" class="spinner-border spinner-border-sm ms-1" style="display: none;"></span>
@@ -145,42 +145,36 @@
<!-- Trigger Scan Modal -->
<div class="modal fade" id="triggerScanModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content" style="background-color: #1e293b; border: 1px solid #334155;">
<div class="modal-header" style="border-bottom: 1px solid #334155;">
<h5 class="modal-title" style="color: #60a5fa;">Trigger New Scan</h5>
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Trigger New Scan</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<form id="trigger-scan-form">
<div class="mb-3">
<label for="config-file" class="form-label">Config File</label>
<select class="form-select" id="config-file" name="config_file" required {% if not config_files %}disabled{% endif %}>
<option value="">Select a config file...</option>
{% for config in config_files %}
<option value="{{ config }}">{{ config }}</option>
{% endfor %}
<label for="config-select" class="form-label">Scan Configuration</label>
<select class="form-select" id="config-select" name="config_id" required>
<option value="">Loading configurations...</option>
</select>
{% if config_files %}
<div class="form-text text-muted">
Select a scan configuration file
<div class="form-text text-muted" id="config-help-text">
Select a scan configuration
</div>
{% else %}
<div class="alert alert-warning mt-2 mb-0" role="alert">
<div id="no-configs-warning" class="alert alert-warning mt-2 mb-0" role="alert" style="display: none;">
<i class="bi bi-exclamation-triangle"></i>
<strong>No configurations available</strong>
<p class="mb-2 mt-2">You need to create a configuration file before you can trigger a scan.</p>
<a href="{{ url_for('main.upload_config') }}" class="btn btn-sm btn-primary">
<p class="mb-2 mt-2">You need to create a configuration before you can trigger a scan.</p>
<a href="{{ url_for('main.configs') }}" class="btn btn-sm btn-primary">
<i class="bi bi-plus-circle"></i> Create Configuration
</a>
</div>
{% endif %}
</div>
<div id="trigger-error" class="alert alert-danger" style="display: none;"></div>
</form>
</div>
<div class="modal-footer" style="border-top: 1px solid #334155;">
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="button" class="btn btn-primary" onclick="triggerScan()" {% if not config_files %}disabled{% endif %}>
<button type="button" class="btn btn-primary" id="trigger-scan-btn" onclick="triggerScan()">
<span id="modal-trigger-text">Trigger Scan</span>
<span id="modal-trigger-spinner" class="spinner-border spinner-border-sm ms-2" style="display: none;"></span>
</button>
@@ -323,23 +317,75 @@
});
}

// Load available configs
async function loadConfigs() {
const selectEl = document.getElementById('config-select');
const helpTextEl = document.getElementById('config-help-text');
const noConfigsWarning = document.getElementById('no-configs-warning');
const triggerBtn = document.getElementById('trigger-scan-btn');

try {
const response = await fetch('/api/configs');
if (!response.ok) {
throw new Error('Failed to load configurations');
}

const data = await response.json();
const configs = data.configs || [];

// Clear existing options
selectEl.innerHTML = '';

if (configs.length === 0) {
selectEl.innerHTML = '<option value="">No configurations available</option>';
selectEl.disabled = true;
triggerBtn.disabled = true;
helpTextEl.style.display = 'none';
noConfigsWarning.style.display = 'block';
} else {
selectEl.innerHTML = '<option value="">Select a configuration...</option>';
configs.forEach(config => {
const option = document.createElement('option');
option.value = config.id;
const siteText = config.site_count === 1 ? 'site' : 'sites';
option.textContent = `${config.title} (${config.site_count} ${siteText})`;
selectEl.appendChild(option);
});
selectEl.disabled = false;
triggerBtn.disabled = false;
helpTextEl.style.display = 'block';
noConfigsWarning.style.display = 'none';
}
} catch (error) {
console.error('Error loading configs:', error);
selectEl.innerHTML = '<option value="">Error loading configurations</option>';
selectEl.disabled = true;
triggerBtn.disabled = true;
helpTextEl.style.display = 'none';
}
}

// Show trigger scan modal
function showTriggerScanModal() {
const modal = new bootstrap.Modal(document.getElementById('triggerScanModal'));
document.getElementById('trigger-error').style.display = 'none';
document.getElementById('trigger-scan-form').reset();

// Load configs when modal is shown
loadConfigs();

modal.show();
}

// Trigger scan
async function triggerScan() {
const configFile = document.getElementById('config-file').value;
const configId = document.getElementById('config-select').value;
const errorEl = document.getElementById('trigger-error');
const btnText = document.getElementById('modal-trigger-text');
const btnSpinner = document.getElementById('modal-trigger-spinner');

if (!configFile) {
errorEl.textContent = 'Please enter a config file path.';
if (!configId) {
errorEl.textContent = 'Please select a configuration.';
errorEl.style.display = 'block';
return;
}
@@ -356,7 +402,7 @@
'Content-Type': 'application/json',
},
body: JSON.stringify({
config_file: configFile
config_id: parseInt(configId)
})
});

375 app/web/templates/help.html Normal file
@@ -0,0 +1,375 @@
{% extends "base.html" %}

{% block title %}Help - SneakyScanner{% endblock %}

{% block content %}
<div class="row mt-4">
<div class="col-12">
<h1 class="mb-4"><i class="bi bi-question-circle"></i> Help & Documentation</h1>
<p class="text-muted">Learn how to use SneakyScanner to manage your network scanning operations.</p>
</div>
</div>

<!-- Quick Navigation -->
<div class="row mb-4">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-compass"></i> Quick Navigation</h5>
</div>
<div class="card-body">
<div class="row g-2">
<div class="col-md-3 col-6">
<a href="#getting-started" class="btn btn-outline-primary w-100">Getting Started</a>
</div>
<div class="col-md-3 col-6">
<a href="#sites" class="btn btn-outline-primary w-100">Sites</a>
</div>
<div class="col-md-3 col-6">
<a href="#scan-configs" class="btn btn-outline-primary w-100">Scan Configs</a>
</div>
<div class="col-md-3 col-6">
<a href="#running-scans" class="btn btn-outline-primary w-100">Running Scans</a>
</div>
<div class="col-md-3 col-6">
<a href="#scheduling" class="btn btn-outline-primary w-100">Scheduling</a>
</div>
<div class="col-md-3 col-6">
<a href="#comparisons" class="btn btn-outline-primary w-100">Comparisons</a>
</div>
<div class="col-md-3 col-6">
<a href="#alerts" class="btn btn-outline-primary w-100">Alerts</a>
</div>
<div class="col-md-3 col-6">
<a href="#webhooks" class="btn btn-outline-primary w-100">Webhooks</a>
</div>
</div>
</div>
</div>
</div>
</div>

<!-- Getting Started -->
<div class="row mb-4" id="getting-started">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-rocket-takeoff"></i> Getting Started</h5>
</div>
<div class="card-body">
<p>SneakyScanner helps you perform network vulnerability scans and track changes over time. Here's the typical workflow:</p>

<div class="alert alert-info">
<strong>Basic Workflow:</strong>
<ol class="mb-0 mt-2">
<li><strong>Create a Site</strong> - Define a logical grouping for your targets</li>
<li><strong>Add IPs</strong> - Add IP addresses or ranges to your site</li>
<li><strong>Create a Scan Config</strong> - Configure how scans should run using your site</li>
<li><strong>Run a Scan</strong> - Execute scans manually or on a schedule</li>
<li><strong>Review Results</strong> - Analyze findings and compare scans over time</li>
</ol>
</div>
</div>
</div>
</div>
</div>

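> **Editor's note.** For readers who prefer to script this workflow against the REST API rather than the UI, here is a minimal sketch. The `GET /api/sites?all=true` call and the `POST /api/configs` body shape are taken from the templates in this changeset; the `POST /api/scans` path and the create-config response fields are assumptions and may differ from the actual API.

```javascript
// Hypothetical quickstart against the SneakyScanner REST API.
// Grounded: GET /api/sites?all=true, POST /api/configs {title, description, site_ids}.
// Assumed: the POST /api/scans endpoint and the create-config response shape.
async function quickstart() {
  // 1. Pick an existing site (assumes at least one site already exists)
  const { sites } = await (await fetch('/api/sites?all=true')).json();
  if (!sites.length) throw new Error('Create a site first');

  // 2. Create a scan config that references it
  const created = await (await fetch('/api/configs', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ title: 'Quickstart', description: null, site_ids: [sites[0].id] })
  })).json();

  // 3. Trigger a scan for the new config (endpoint path is an assumption)
  await fetch('/api/scans', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ config_id: created.id })
  });
}
```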
<!-- Sites -->
<div class="row mb-4" id="sites">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-globe"></i> Creating Sites & Adding IPs</h5>
</div>
<div class="card-body">
<h6>What is a Site?</h6>
<p>A Site is a logical grouping of IP addresses that you want to scan together. For example, you might create separate sites for "Production Servers", "Development Environment", or "Office Network".</p>

<h6>Creating a Site</h6>
<ol>
<li>Navigate to <strong>Configs → Sites</strong> in the navigation menu</li>
<li>Click the <strong>Create Site</strong> button</li>
<li>Enter a descriptive name for your site</li>
<li>Optionally add a description to help identify the site's purpose</li>
<li>Click <strong>Create</strong> to save the site</li>
</ol>

<h6>Adding IP Addresses</h6>
<p>After creating a site, you need to add the IP addresses you want to scan:</p>
<ol>
<li>Find your site in the Sites list</li>
<li>Click the <strong>Manage IPs</strong> button (or the site name)</li>
<li>Click <strong>Add IP</strong></li>
<li>Enter the IP address or CIDR range (e.g., <code>192.168.1.1</code> or <code>192.168.1.0/24</code>)</li>
<li>Click <strong>Add</strong> to save</li>
</ol>

<div class="alert alert-warning">
<i class="bi bi-exclamation-triangle"></i> <strong>Note:</strong> You can add individual IPs or CIDR notation ranges. Large ranges will result in longer scan times.
</div>
</div>
</div>
</div>
</div>

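> **Editor's note.** To make the CIDR warning concrete: the number of addresses in an IPv4 block doubles for every bit removed from the prefix, so scan time grows quickly with shorter prefixes. A quick sanity check, plain arithmetic rather than any SneakyScanner API:

```javascript
// Number of addresses covered by an IPv4 CIDR block.
function addressCount(cidr) {
  const prefix = parseInt(cidr.split('/')[1] ?? '32', 10); // a bare IP counts as /32
  return 2 ** (32 - prefix);
}

console.log(addressCount('192.168.1.1'));    // 1
console.log(addressCount('192.168.1.0/24')); // 256
console.log(addressCount('10.0.0.0/16'));    // 65536
```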
<!-- Scan Configs -->
<div class="row mb-4" id="scan-configs">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-gear"></i> Creating Scan Configurations</h5>
</div>
<div class="card-body">
<h6>What is a Scan Config?</h6>
<p>A Scan Configuration defines how a scan should be performed. It links to a Site and specifies scanning parameters like ports to scan, timing options, and other settings.</p>

<h6>Creating a Scan Config</h6>
<ol>
<li>Navigate to <strong>Configs → Scan Configs</strong> in the navigation menu</li>
<li>Click the <strong>Create Config</strong> button</li>
<li>Enter a name for the configuration</li>
<li>Select the <strong>Site</strong> to associate with this config</li>
<li>Configure scan parameters:
<ul>
<li><strong>Ports</strong> - Specify ports to scan (e.g., <code>22,80,443</code> or <code>1-1000</code>)</li>
<li><strong>Timing</strong> - Set scan speed/aggressiveness</li>
<li><strong>Additional Options</strong> - Configure other nmap parameters as needed</li>
</ul>
</li>
<li>Click <strong>Create</strong> to save the configuration</li>
</ol>

<div class="alert alert-info">
<i class="bi bi-info-circle"></i> <strong>Tip:</strong> Create different configs for different purposes - a quick config for daily checks and a thorough config for weekly deep scans.
</div>
</div>
</div>
</div>
</div>

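> **Editor's note.** Configs can also be created programmatically. The endpoint, request body, and error field below mirror the create-config handler in `configs.html` earlier in this changeset; the success response shape is an assumption.

```javascript
// Create a scan config via the API (request shape taken from configs.html).
(async () => {
  const res = await fetch('/api/configs', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      title: 'Production Weekly Scan',    // required
      description: 'Thorough weekly run', // optional, may be null
      site_ids: [1, 2]                    // at least one site id is required
    })
  });
  if (!res.ok) {
    const data = await res.json();
    throw new Error(data.message || 'Failed to create config');
  }
})();
```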
<!-- Running Scans -->
<div class="row mb-4" id="running-scans">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-play-circle"></i> Running Scans</h5>
</div>
<div class="card-body">
<h6>Starting a Manual Scan</h6>
<ol>
<li>Navigate to <strong>Scans</strong> in the navigation menu</li>
<li>Click the <strong>New Scan</strong> button</li>
<li>Select the <strong>Scan Config</strong> you want to use</li>
<li>Click <strong>Start Scan</strong></li>
</ol>

<h6>Monitoring Scan Progress</h6>
<p>While a scan is running:</p>
<ul>
<li>The scan will appear in the Scans list with a <span class="badge badge-warning">Running</span> status</li>
<li>You can view live progress by clicking on the scan</li>
<li>The Dashboard also shows active scans</li>
</ul>

<h6>Viewing Scan Results</h6>
<ol>
<li>Once complete, click on a scan in the Scans list</li>
<li>View discovered hosts, open ports, and services</li>
<li>Export results or compare with previous scans</li>
</ol>
</div>
</div>
</div>
</div>

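> **Editor's note.** The dashboard's `triggerScan()` sends a JSON body of `{config_id: <id>}`, so the same request can be issued from a script. The body shape is grounded in the dashboard template above, but the endpoint path is truncated in this diff, so `/api/scans` below is an assumption.

```javascript
// Trigger a scan by config id, as the dashboard modal does.
(async () => {
  const res = await fetch('/api/scans', {   // endpoint path assumed
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ config_id: 42 }) // id taken from GET /api/configs
  });
  if (!res.ok) throw new Error('Failed to trigger scan');
})();
```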
<!-- Scheduling -->
<div class="row mb-4" id="scheduling">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-calendar-check"></i> Scheduling Scans</h5>
</div>
<div class="card-body">
<h6>Why Schedule Scans?</h6>
<p>Scheduled scans allow you to automatically run scans at regular intervals, ensuring continuous monitoring of your network without manual intervention.</p>

<h6>Creating a Schedule</h6>
<ol>
<li>Navigate to <strong>Schedules</strong> in the navigation menu</li>
<li>Click the <strong>Create Schedule</strong> button</li>
<li>Enter a name for the schedule</li>
<li>Select the <strong>Scan Config</strong> to use</li>
<li>Configure the schedule:
<ul>
<li><strong>Frequency</strong> - How often to run (daily, weekly, monthly, custom cron)</li>
<li><strong>Time</strong> - When to start the scan</li>
<li><strong>Days</strong> - Which days to run (for weekly schedules)</li>
</ul>
</li>
<li>Enable/disable the schedule as needed</li>
<li>Click <strong>Create</strong> to save</li>
</ol>

<h6>Managing Schedules</h6>
<ul>
<li><strong>Enable/Disable</strong> - Toggle schedules on or off without deleting them</li>
<li><strong>Edit</strong> - Modify the schedule timing or associated config</li>
<li><strong>Delete</strong> - Remove schedules you no longer need</li>
<li><strong>View History</strong> - See past runs triggered by the schedule</li>
</ul>

<div class="alert alert-info">
<i class="bi bi-info-circle"></i> <strong>Tip:</strong> Schedule comprehensive scans during off-peak hours to minimize network impact.
</div>
</div>
</div>
</div>
</div>

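> **Editor's note.** For custom cron schedules, standard five-field cron syntax applies. The payload below is purely illustrative; the field names are assumptions, not the actual schedule API contract.

```javascript
// Hypothetical schedule payload with a custom cron expression.
const schedule = {
  name: 'Nightly production scan',
  config_id: 42,        // config to run (field name assumed)
  cron: '0 2 * * *',    // 02:00 every day
  enabled: true
};
// Other common cron expressions:
//   '0 2 * * 1'  every Monday at 02:00 (weekly)
//   '0 2 1 * *'  the 1st of each month at 02:00 (monthly)
```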
<!-- Scan Comparisons -->
<div class="row mb-4" id="comparisons">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-arrow-left-right"></i> Scan Comparisons</h5>
</div>
<div class="card-body">
<h6>Why Compare Scans?</h6>
<p>Comparing scans helps you identify changes in your network over time - new hosts, closed ports, new services, or potential security issues.</p>

<h6>Comparing Two Scans</h6>
<ol>
<li>Navigate to <strong>Scans</strong> in the navigation menu</li>
<li>Find the scan you want to use as the baseline</li>
<li>Click on the scan to view its details</li>
<li>Click the <strong>Compare</strong> button</li>
<li>Select another scan to compare against</li>
<li>Review the comparison results</li>
</ol>

<h6>Understanding Comparison Results</h6>
<p>The comparison view shows:</p>
<ul>
<li><span class="badge badge-success">New</span> - Hosts or ports that appear in the newer scan but not the older one</li>
<li><span class="badge badge-danger">Removed</span> - Hosts or ports that were in the older scan but not the newer one</li>
<li><span class="badge badge-warning">Changed</span> - Services or states that differ between scans</li>
<li><span class="badge badge-info">Unchanged</span> - Items that remain the same</li>
</ul>

<div class="alert alert-warning">
<i class="bi bi-exclamation-triangle"></i> <strong>Security Note:</strong> Pay close attention to unexpected new open ports or services - these could indicate unauthorized changes or potential compromises.
</div>
</div>
</div>
</div>
</div>

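> **Editor's note.** The comparison template later in this changeset reads `scan1`, `scan2`, and `ports` fields from a comparison response, so those field names are grounded; the endpoint URL and query parameters below are assumptions.

```javascript
// Fetch and inspect a comparison between two scans (endpoint shape assumed;
// the scan1/scan2/ports fields appear in the comparison template below).
(async () => {
  const diff = await (await fetch('/api/scans/compare?scan1=10&scan2=12')).json();
  console.log('Baseline:', diff.scan1.id, 'Candidate:', diff.scan2.id);
  console.log('Port changes:', diff.ports); // new/removed/changed/unchanged breakdown
})();
```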
<!-- Alerts -->
<div class="row mb-4" id="alerts">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-bell"></i> Alerts & Alert Rules</h5>
</div>
<div class="card-body">
<h6>Understanding Alerts</h6>
<p>Alerts notify you when scan results match certain conditions you define. This helps you stay informed about important changes without manually reviewing every scan.</p>

<h6>Viewing Alert History</h6>
<ol>
<li>Navigate to <strong>Alerts → Alert History</strong></li>
<li>View all triggered alerts with timestamps and details</li>
<li>Filter alerts by severity, date, or type</li>
<li>Click on an alert to see full details and the associated scan</li>
</ol>

<h6>Creating Alert Rules</h6>
<ol>
<li>Navigate to <strong>Alerts → Alert Rules</strong></li>
<li>Click <strong>Create Rule</strong></li>
<li>Configure the rule:
<ul>
<li><strong>Name</strong> - A descriptive name for the rule</li>
<li><strong>Condition</strong> - What triggers the alert (e.g., new open port, new host, specific service detected)</li>
<li><strong>Severity</strong> - How critical the alert is (Info, Warning, Critical)</li>
<li><strong>Scope</strong> - Which sites or configs this rule applies to</li>
</ul>
</li>
<li>Enable the rule</li>
<li>Click <strong>Create</strong> to save</li>
</ol>

<h6>Common Alert Rule Examples</h6>
<ul>
<li><strong>New Host Detected</strong> - Alert when a previously unknown host appears</li>
<li><strong>New Open Port</strong> - Alert when a new port opens on any host</li>
<li><strong>Critical Port Open</strong> - Alert for specific high-risk ports (e.g., 23/Telnet, 3389/RDP)</li>
<li><strong>Service Change</strong> - Alert when a service version changes</li>
<li><strong>Host Offline</strong> - Alert when an expected host stops responding</li>
</ul>

<div class="alert alert-info">
<i class="bi bi-info-circle"></i> <strong>Tip:</strong> Start with a few important rules and refine them over time to avoid alert fatigue.
</div>
</div>
</div>
</div>
</div>

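> **Editor's note.** As a concrete instance of the fields listed above (name, condition, severity, scope), a rule for the "Critical Port Open" example might look like the sketch below. The field names are illustrative assumptions, not the actual API contract.

```javascript
// Hypothetical alert-rule payload: fire a critical alert when RDP appears.
const rule = {
  name: 'RDP exposed',
  condition: { type: 'new_open_port', ports: [3389] }, // condition format assumed
  severity: 'critical',                                // Info | Warning | Critical
  scope: { site_ids: [1] },                            // limit the rule to one site
  enabled: true
};
```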
<!-- Webhooks -->
<div class="row mb-4" id="webhooks">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0"><i class="bi bi-broadcast"></i> Webhooks</h5>
</div>
<div class="card-body">
<h6>What are Webhooks?</h6>
<p>Webhooks allow SneakyScanner to send notifications to external services when events occur, such as scan completion or alert triggers. This enables integration with tools like Slack, Discord, Microsoft Teams, or custom systems.</p>

<h6>Creating a Webhook</h6>
<ol>
<li>Navigate to <strong>Alerts → Webhooks</strong></li>
<li>Click <strong>Create Webhook</strong></li>
<li>Configure the webhook:
<ul>
<li><strong>Name</strong> - A descriptive name</li>
<li><strong>URL</strong> - The endpoint to send notifications to</li>
<li><strong>Events</strong> - Which events trigger this webhook</li>
<li><strong>Secret</strong> - Optional secret for request signing</li>
</ul>
</li>
<li>Test the webhook to verify it works</li>
<li>Click <strong>Create</strong> to save</li>
</ol>

<h6>Webhook Events</h6>
<ul>
<li><strong>Scan Started</strong> - When a scan begins</li>
<li><strong>Scan Completed</strong> - When a scan finishes</li>
<li><strong>Scan Failed</strong> - When a scan encounters an error</li>
<li><strong>Alert Triggered</strong> - When an alert rule matches</li>
</ul>

<h6>Integration Examples</h6>
<ul>
<li><strong>Slack</strong> - Use a Slack Incoming Webhook URL</li>
<li><strong>Discord</strong> - Use a Discord Webhook URL</li>
<li><strong>Microsoft Teams</strong> - Use a Teams Incoming Webhook</li>
<li><strong>Custom API</strong> - Send to your own endpoint for custom processing</li>
</ul>
</div>
</div>
</div>
</div>

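> **Editor's note.** If the optional webhook secret is used for request signing, the receiver should verify the signature before trusting the payload. A common scheme is HMAC-SHA256 over the raw body with the digest sent in a header; the header name and exact scheme here are assumptions, so check the actual sender format.

```javascript
// Verify a signed webhook on the receiving end (Node.js sketch, scheme assumed).
const crypto = require('crypto');

function verifySignature(rawBody, signatureHeader, secret) {
  // Recompute the HMAC-SHA256 of the raw request body with the shared secret
  const expected = crypto.createHmac('sha256', secret).update(rawBody).digest('hex');
  // Constant-time comparison; timingSafeEqual requires equal-length buffers
  return signatureHeader.length === expected.length &&
    crypto.timingSafeEqual(Buffer.from(signatureHeader), Buffer.from(expected));
}
```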
<!-- Back to Top -->
<div class="row mb-4">
<div class="col-12 text-center">
<a href="#" class="btn btn-outline-secondary">
<i class="bi bi-arrow-up"></i> Back to Top
</a>
</div>
</div>

{% endblock %}
175 app/web/templates/ip_search_results.html Normal file
@@ -0,0 +1,175 @@
{% extends "base.html" %}

{% block title %}Search Results for {{ ip_address }} - SneakyScanner{% endblock %}

{% block content %}
<div class="row mt-4">
<div class="col-12 d-flex justify-content-between align-items-center mb-4">
<h1>
<i class="bi bi-search"></i>
Search Results
{% if ip_address %}
<small class="text-muted">for {{ ip_address }}</small>
{% endif %}
</h1>
<a href="{{ url_for('main.scans') }}" class="btn btn-secondary">
<i class="bi bi-arrow-left"></i> Back to Scans
</a>
</div>
</div>

{% if not ip_address %}
<!-- No IP provided -->
<div class="row">
<div class="col-12">
<div class="card">
<div class="card-body text-center py-5">
<i class="bi bi-exclamation-circle text-warning" style="font-size: 3rem;"></i>
<h4 class="mt-3">No IP Address Provided</h4>
<p class="text-muted">Please enter an IP address in the search box to find related scans.</p>
</div>
</div>
</div>
</div>
{% else %}
<!-- Results Table -->
<div class="row">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0">Last 10 Scans Containing {{ ip_address }}</h5>
</div>
<div class="card-body">
<div id="results-loading" class="text-center py-5">
<div class="spinner-border" role="status">
<span class="visually-hidden">Loading...</span>
</div>
<p class="mt-3 text-muted">Searching for scans...</p>
</div>
<div id="results-error" class="alert alert-danger" style="display: none;"></div>
<div id="results-empty" class="text-center py-5 text-muted" style="display: none;">
<i class="bi bi-search" style="font-size: 3rem;"></i>
<h5 class="mt-3">No Scans Found</h5>
<p>No completed scans contain the IP address <strong>{{ ip_address }}</strong>.</p>
</div>
<div id="results-table-container" style="display: none;">
<div class="table-responsive">
<table class="table table-hover">
<thead>
<tr>
<th style="width: 80px;">ID</th>
<th>Title</th>
<th style="width: 200px;">Timestamp</th>
<th style="width: 100px;">Duration</th>
<th style="width: 120px;">Status</th>
<th style="width: 100px;">Actions</th>
</tr>
</thead>
<tbody id="results-tbody">
</tbody>
</table>
</div>
<div class="text-muted mt-3">
Found <span id="result-count">0</span> scan(s) containing this IP address.
</div>
</div>
</div>
</div>
</div>
</div>
{% endif %}
{% endblock %}

{% block scripts %}
<script>
const ipAddress = "{{ ip_address | e }}";

// Load results when page loads
document.addEventListener('DOMContentLoaded', function() {
if (ipAddress) {
loadResults();
}
});

// Load search results from API
async function loadResults() {
const loadingEl = document.getElementById('results-loading');
const errorEl = document.getElementById('results-error');
const emptyEl = document.getElementById('results-empty');
const tableEl = document.getElementById('results-table-container');

// Show loading state
loadingEl.style.display = 'block';
errorEl.style.display = 'none';
emptyEl.style.display = 'none';
tableEl.style.display = 'none';

try {
const response = await fetch(`/api/scans/by-ip/${encodeURIComponent(ipAddress)}`);
if (!response.ok) {
throw new Error('Failed to search for scans');
}

const data = await response.json();
const scans = data.scans || [];

loadingEl.style.display = 'none';

if (scans.length === 0) {
emptyEl.style.display = 'block';
} else {
tableEl.style.display = 'block';
renderResultsTable(scans);
document.getElementById('result-count').textContent = data.count;
}
} catch (error) {
console.error('Error searching for scans:', error);
loadingEl.style.display = 'none';
errorEl.textContent = 'Failed to search for scans. Please try again.';
errorEl.style.display = 'block';
}
}

// Render results table
function renderResultsTable(scans) {
const tbody = document.getElementById('results-tbody');
tbody.innerHTML = '';

scans.forEach(scan => {
const row = document.createElement('tr');
row.classList.add('scan-row');

// Format timestamp
const timestamp = new Date(scan.timestamp).toLocaleString();

// Format duration
const duration = scan.duration ? `${scan.duration.toFixed(1)}s` : '-';

// Status badge
let statusBadge = '';
if (scan.status === 'completed') {
statusBadge = '<span class="badge badge-success">Completed</span>';
} else if (scan.status === 'running') {
statusBadge = '<span class="badge badge-info">Running</span>';
} else if (scan.status === 'failed') {
statusBadge = '<span class="badge badge-danger">Failed</span>';
} else {
statusBadge = `<span class="badge badge-info">${scan.status}</span>`;
}

row.innerHTML = `
<td class="mono">${scan.id}</td>
<td>${scan.title || 'Untitled Scan'}</td>
<td class="text-muted">${timestamp}</td>
<td class="mono">${duration}</td>
<td>${statusBadge}</td>
<td>
<a href="/scans/${scan.id}" class="btn btn-sm btn-secondary">View</a>
</td>
`;

tbody.appendChild(row);
});
}
</script>
{% endblock %}
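> **Editor's note.** The `/api/scans/by-ip/` endpoint this template calls returns a `scans` array and a `count`, per the code above, so it can also be queried directly from scripts:

```javascript
// Query the last scans containing a given IP (same endpoint as the page above).
(async () => {
  const data = await (await fetch('/api/scans/by-ip/192.168.1.10')).json();
  console.log(`Found ${data.count} scan(s):`, (data.scans || []).map(s => s.id));
})();
```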
@@ -375,12 +375,12 @@
document.getElementById('scan1-id').textContent = data.scan1.id;
document.getElementById('scan1-title').textContent = data.scan1.title || 'Untitled Scan';
document.getElementById('scan1-timestamp').textContent = new Date(data.scan1.timestamp).toLocaleString();
document.getElementById('scan1-config').textContent = data.scan1.config_file || 'Unknown';
document.getElementById('scan1-config').textContent = data.scan1.config_id || 'Unknown';

document.getElementById('scan2-id').textContent = data.scan2.id;
document.getElementById('scan2-title').textContent = data.scan2.title || 'Untitled Scan';
document.getElementById('scan2-timestamp').textContent = new Date(data.scan2.timestamp).toLocaleString();
document.getElementById('scan2-config').textContent = data.scan2.config_file || 'Unknown';
document.getElementById('scan2-config').textContent = data.scan2.config_id || 'Unknown';

// Ports comparison
populatePortsComparison(data.ports);

@@ -20,6 +20,10 @@
<span id="refresh-text">Refresh</span>
<span id="refresh-spinner" class="spinner-border spinner-border-sm ms-1" style="display: none;"></span>
</button>
<button class="btn btn-warning ms-2" onclick="stopScan()" id="stop-btn" style="display: none;">
<span id="stop-text">Stop Scan</span>
<span id="stop-spinner" class="spinner-border spinner-border-sm ms-1" style="display: none;"></span>
</button>
<button class="btn btn-danger ms-2" onclick="deleteScan()" id="delete-btn">Delete Scan</button>
</div>
</div>
@@ -79,13 +83,49 @@
</div>
</div>
</div>
<div class="row">
<div class="col-md-12">
<div class="mb-0">
<label class="form-label text-muted">Config File</label>
<div id="scan-config-file" class="mono">-</div>
</div>
</div>
</div>
</div>
</div>

<!-- Progress Section (shown when scan is running) -->
<div class="row mb-4" id="progress-section" style="display: none;">
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0" style="color: #60a5fa;">
<i class="bi bi-hourglass-split"></i> Scan Progress
</h5>
</div>
<div class="card-body">
<!-- Phase and Progress Bar -->
<div class="mb-3">
<div class="d-flex justify-content-between align-items-center mb-2">
<span>Current Phase: <strong id="current-phase">Initializing...</strong></span>
<span id="progress-count">0 / 0 IPs</span>
</div>
<div class="progress" style="height: 20px; background-color: #334155;">
<div id="progress-bar" class="progress-bar bg-info" role="progressbar" style="width: 0%"></div>
</div>
</div>

<!-- Per-IP Results Table -->
<div class="table-responsive" style="max-height: 400px; overflow-y: auto;">
<table class="table table-sm">
<thead style="position: sticky; top: 0; background-color: #1e293b;">
<tr>
<th>Site</th>
<th>IP Address</th>
<th>Ping</th>
<th>TCP Ports</th>
<th>UDP Ports</th>
<th>Services</th>
</tr>
</thead>
<tbody id="progress-table-body">
<tr><td colspan="6" class="text-center text-muted">Waiting for results...</td></tr>
</tbody>
</table>
</div>
</div>
</div>
@@ -162,6 +202,67 @@
</div>
</div>
</div>

<!-- Certificate Details Modal -->
<div class="modal fade" id="certificateModal" tabindex="-1">
<div class="modal-dialog modal-lg">
<div class="modal-content" style="background-color: #1e293b; border: 1px solid #334155;">
<div class="modal-header" style="border-bottom: 1px solid #334155;">
<h5 class="modal-title" style="color: #60a5fa;">
<i class="bi bi-shield-lock"></i> Certificate Details
</h5>
<button type="button" class="btn-close btn-close-white" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<div class="row mb-3">
<div class="col-md-6">
<label class="form-label text-muted">Subject</label>
<div id="cert-subject" class="mono" style="word-break: break-all;">-</div>
</div>
<div class="col-md-6">
<label class="form-label text-muted">Issuer</label>
<div id="cert-issuer" class="mono" style="word-break: break-all;">-</div>
</div>
</div>
<div class="row mb-3">
<div class="col-md-4">
<label class="form-label text-muted">Valid From</label>
<div id="cert-valid-from" class="mono">-</div>
</div>
<div class="col-md-4">
<label class="form-label text-muted">Valid Until</label>
<div id="cert-valid-until" class="mono">-</div>
</div>
<div class="col-md-4">
<label class="form-label text-muted">Days Until Expiry</label>
<div id="cert-days-expiry">-</div>
</div>
</div>
<div class="row mb-3">
<div class="col-md-6">
<label class="form-label text-muted">Serial Number</label>
<div id="cert-serial" class="mono" style="word-break: break-all;">-</div>
</div>
<div class="col-md-6">
<label class="form-label text-muted">Self-Signed</label>
<div id="cert-self-signed">-</div>
</div>
</div>
<div class="mb-3">
<label class="form-label text-muted">Subject Alternative Names (SANs)</label>
<div id="cert-sans">-</div>
</div>
<div class="mb-3">
<label class="form-label text-muted">TLS Version Support</label>
<div id="cert-tls-versions">-</div>
</div>
</div>
<div class="modal-footer" style="border-top: 1px solid #334155;">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</div>
</div>
{% endblock %}

{% block scripts %}
@@ -169,22 +270,162 @@
const scanId = {{ scan_id }};
let scanData = null;
let historyChart = null; // Store chart instance to prevent duplicates
let progressInterval = null; // Store progress polling interval

// Show alert notification
function showAlert(type, message) {
const container = document.getElementById('notification-container');
const notification = document.createElement('div');
notification.className = `alert alert-${type} alert-dismissible fade show mb-2`;

notification.innerHTML = `
${message}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
`;

container.appendChild(notification);

// Auto-dismiss after 5 seconds
setTimeout(() => {
notification.remove();
}, 5000);
}

// Load scan on page load
document.addEventListener('DOMContentLoaded', function() {
loadScan().then(() => {
findPreviousScan();
loadHistoricalChart();

// Start progress polling if scan is running
if (scanData && scanData.status === 'running') {
startProgressPolling();
}
});

// Auto-refresh every 10 seconds if scan is running
setInterval(function() {
if (scanData && scanData.status === 'running') {
loadScan();
}
}, 10000);
});

// Start polling for progress updates
function startProgressPolling() {
// Show progress section
document.getElementById('progress-section').style.display = 'block';

// Initial load
loadProgress();

// Poll every 3 seconds
progressInterval = setInterval(loadProgress, 3000);
}

// Stop polling for progress updates
function stopProgressPolling() {
if (progressInterval) {
clearInterval(progressInterval);
progressInterval = null;
}
// Hide progress section when scan completes
document.getElementById('progress-section').style.display = 'none';
}

// Load progress data
async function loadProgress() {
try {
const response = await fetch(`/api/scans/${scanId}/progress`);
if (!response.ok) return;

const progress = await response.json();

// Check if scan is still running
if (progress.status !== 'running') {
stopProgressPolling();
loadScan(); // Refresh full scan data
return;
}

renderProgress(progress);
} catch (error) {
console.error('Error loading progress:', error);
}
}

// Render progress data
function renderProgress(progress) {
// Update phase display
const phaseNames = {
'pending': 'Initializing',
'ping': 'Ping Scan',
'tcp_scan': 'TCP Port Scan',
'udp_scan': 'UDP Port Scan',
'service_detection': 'Service Detection',
'http_analysis': 'HTTP/HTTPS Analysis',
'completed': 'Completing'
};

const phaseName = phaseNames[progress.current_phase] || progress.current_phase;
document.getElementById('current-phase').textContent = phaseName;

// Update progress count and bar
const total = progress.total_ips || 0;
const completed = progress.completed_ips || 0;
const percent = total > 0 ? Math.round((completed / total) * 100) : 0;

document.getElementById('progress-count').textContent = `${completed} / ${total} IPs`;
document.getElementById('progress-bar').style.width = `${percent}%`;

// Update progress table
const tbody = document.getElementById('progress-table-body');
const entries = progress.progress_entries || [];

if (entries.length === 0) {
tbody.innerHTML = '<tr><td colspan="6" class="text-center text-muted">Waiting for results...</td></tr>';
return;
}

let html = '';
entries.forEach(entry => {
// Ping result
let pingDisplay = '-';
if (entry.ping_result !== null && entry.ping_result !== undefined) {
pingDisplay = entry.ping_result
? '<span class="badge badge-success">Yes</span>'
: '<span class="badge badge-danger">No</span>';
}

// TCP ports
const tcpPorts = entry.tcp_ports || [];
let tcpDisplay = tcpPorts.length > 0
? `<span class="badge bg-info">${tcpPorts.length}</span> <small class="text-muted">${tcpPorts.slice(0, 5).join(', ')}${tcpPorts.length > 5 ? '...' : ''}</small>`
: '-';

// UDP ports
const udpPorts = entry.udp_ports || [];
let udpDisplay = udpPorts.length > 0
? `<span class="badge bg-info">${udpPorts.length}</span>`
: '-';

// Services
const services = entry.services || [];
let svcDisplay = '-';
if (services.length > 0) {
const svcNames = services.map(s => s.service || 'unknown').slice(0, 3);
svcDisplay = `<span class="badge bg-info">${services.length}</span> <small class="text-muted">${svcNames.join(', ')}${services.length > 3 ? '...' : ''}</small>`;
}

html += `
<tr class="scan-row">
<td>${entry.site_name || '-'}</td>
<td class="mono">${entry.ip_address}</td>
<td>${pingDisplay}</td>
<td>${tcpDisplay}</td>
<td>${udpDisplay}</td>
<td>${svcDisplay}</td>
</tr>
`;
});

tbody.innerHTML = html;
}

// Load scan details
async function loadScan() {
const loadingEl = document.getElementById('scan-loading');
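The same progress endpoint the page polls works from a script. A rough Python equivalent of the 3-second poll above; the `/api/scans/<id>/progress` route and the `status`/`current_phase`/`completed_ips`/`total_ips` fields mirror the JavaScript, while the base URL and the absence of auth handling are assumptions:

```python
# Sketch: block until a scan leaves the 'running' state, printing progress.
import time
import requests

BASE_URL = "http://localhost:5000"  # assumption

def wait_for_scan(scan_id: int, interval: float = 3.0) -> None:
    """Poll scan progress until the scan is no longer running."""
    while True:
        resp = requests.get(f"{BASE_URL}/api/scans/{scan_id}/progress", timeout=30)
        resp.raise_for_status()
        progress = resp.json()
        if progress.get("status") != "running":
            print(f"Scan {scan_id} finished with status: {progress.get('status')}")
            return
        done = progress.get("completed_ips") or 0
        total = progress.get("total_ips") or 0
        print(f"{progress.get('current_phase')}: {done}/{total} IPs")
        time.sleep(interval)
```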
@@ -226,7 +467,6 @@
document.getElementById('scan-timestamp').textContent = new Date(scan.timestamp).toLocaleString();
document.getElementById('scan-duration').textContent = scan.duration ? `${scan.duration.toFixed(1)}s` : '-';
document.getElementById('scan-triggered-by').textContent = scan.triggered_by || 'manual';
document.getElementById('scan-config-file').textContent = scan.config_file || '-';

// Status badge
let statusBadge = '';
@@ -235,8 +475,11 @@
} else if (scan.status === 'running') {
statusBadge = '<span class="badge badge-info">Running</span>';
document.getElementById('delete-btn').disabled = true;
document.getElementById('stop-btn').style.display = 'inline-block';
} else if (scan.status === 'failed') {
statusBadge = '<span class="badge badge-danger">Failed</span>';
} else if (scan.status === 'cancelled') {
statusBadge = '<span class="badge badge-warning">Cancelled</span>';
} else {
statusBadge = `<span class="badge badge-info">${scan.status}</span>`;
}
@@ -321,6 +564,8 @@
<th>Product</th>
<th>Version</th>
<th>Status</th>
<th>Screenshot</th>
<th>Certificate</th>
</tr>
</thead>
<tbody id="site-${siteIdx}-ip-${ipIdx}-ports"></tbody>
@@ -334,10 +579,25 @@
const ports = ip.ports || [];

if (ports.length === 0) {
portsContainer.innerHTML = '<tr class="scan-row"><td colspan="7" class="text-center text-muted">No ports found</td></tr>';
portsContainer.innerHTML = '<tr class="scan-row"><td colspan="9" class="text-center text-muted">No ports found</td></tr>';
} else {
ports.forEach(port => {
const service = port.services && port.services.length > 0 ? port.services[0] : null;
const screenshotPath = service && service.screenshot_path ? service.screenshot_path : null;
const certificate = service && service.certificates && service.certificates.length > 0 ? service.certificates[0] : null;

// Build status cell with optional "Mark Expected" button
let statusCell;
if (port.expected) {
statusCell = '<span class="badge badge-good">Expected</span>';
} else {
// Show "Unexpected" badge with "Mark Expected" button if site_id and site_ip_id are available
const canMarkExpected = site.site_id && ip.site_ip_id;
statusCell = `<span class="badge badge-warning">Unexpected</span>`;
if (canMarkExpected) {
statusCell += ` <button class="btn btn-sm btn-outline-success ms-1" onclick="markPortExpected(${site.site_id}, ${ip.site_ip_id}, ${port.port}, '${port.protocol}')" title="Add to expected ports"><i class="bi bi-plus-circle"></i></button>`;
}
}

const row = document.createElement('tr');
row.classList.add('scan-row'); // Fix white row bug
@@ -348,7 +608,9 @@
<td>${service ? service.service_name : '-'}</td>
<td>${service ? service.product || '-' : '-'}</td>
<td class="mono">${service ? service.version || '-' : '-'}</td>
<td>${port.expected ? '<span class="badge badge-good">Expected</span>' : '<span class="badge badge-warning">Unexpected</span>'}</td>
<td>${statusCell}</td>
<td>${screenshotPath ? `<a href="/output/${screenshotPath.replace(/^\/?(?:app\/)?output\/?/, '')}" target="_blank" class="btn btn-sm btn-outline-primary" title="View Screenshot"><i class="bi bi-image"></i></a>` : '-'}</td>
<td>${certificate ? `<button class="btn btn-sm btn-outline-info" onclick='showCertificateModal(${JSON.stringify(certificate).replace(/'/g, "&#39;")})' title="View Certificate"><i class="bi bi-shield-lock"></i></button>` : '-'}</td>
`;
portsContainer.appendChild(row);
});
@@ -447,7 +709,7 @@
window.location.href = '{{ url_for("main.scans") }}';
} catch (error) {
console.error('Error deleting scan:', error);
alert(`Failed to delete scan: ${error.message}`);
showAlert('danger', `Failed to delete scan: ${error.message}`);

// Re-enable button on error
deleteBtn.disabled = false;
@@ -455,15 +717,136 @@
}
}

// Stop scan
async function stopScan() {
if (!confirm(`Are you sure you want to stop scan ${scanId}?`)) {
return;
}

const stopBtn = document.getElementById('stop-btn');
const stopText = document.getElementById('stop-text');
const stopSpinner = document.getElementById('stop-spinner');

// Show loading state
stopBtn.disabled = true;
stopText.style.display = 'none';
stopSpinner.style.display = 'inline-block';

try {
const response = await fetch(`/api/scans/${scanId}/stop`, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
}
});

if (!response.ok) {
let errorMessage = `HTTP ${response.status}: Failed to stop scan`;
try {
const data = await response.json();
errorMessage = data.message || errorMessage;
} catch (e) {
// Ignore JSON parse errors
}
throw new Error(errorMessage);
}

// Show success message
showAlert('success', `Stop signal sent to scan ${scanId}.`);

// Refresh scan data after a short delay
setTimeout(() => {
loadScan();
}, 1000);

} catch (error) {
console.error('Error stopping scan:', error);
showAlert('danger', `Failed to stop scan: ${error.message}`);

// Re-enable button on error
stopBtn.disabled = false;
stopText.style.display = 'inline';
stopSpinner.style.display = 'none';
}
}

// Mark a port as expected in the site config
async function markPortExpected(siteId, ipId, portNumber, protocol) {
try {
// First, get the current IP settings - fetch all IPs with high per_page to find the one we need
const getResponse = await fetch(`/api/sites/${siteId}/ips?per_page=200`);
if (!getResponse.ok) {
throw new Error('Failed to get site IPs');
}
const ipsData = await getResponse.json();

// Find the IP in the site
const ipData = ipsData.ips.find(ip => ip.id === ipId);
if (!ipData) {
throw new Error('IP not found in site');
}

// Get current expected ports
let expectedTcpPorts = ipData.expected_tcp_ports || [];
let expectedUdpPorts = ipData.expected_udp_ports || [];

// Add the new port to the appropriate list
if (protocol.toLowerCase() === 'tcp') {
if (!expectedTcpPorts.includes(portNumber)) {
expectedTcpPorts.push(portNumber);
expectedTcpPorts.sort((a, b) => a - b);
}
} else if (protocol.toLowerCase() === 'udp') {
if (!expectedUdpPorts.includes(portNumber)) {
expectedUdpPorts.push(portNumber);
expectedUdpPorts.sort((a, b) => a - b);
}
}

// Update the IP settings
const updateResponse = await fetch(`/api/sites/${siteId}/ips/${ipId}`, {
method: 'PUT',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({
expected_tcp_ports: expectedTcpPorts,
expected_udp_ports: expectedUdpPorts
})
});

if (!updateResponse.ok) {
let errorMessage = 'Failed to update IP settings';
try {
const errorData = await updateResponse.json();
errorMessage = errorData.message || errorMessage;
} catch (e) {
// Ignore JSON parse errors
}
throw new Error(errorMessage);
}

// Show success message
showAlert('success', `Port ${portNumber}/${protocol.toUpperCase()} added to expected ports for this IP. Refresh the page to see updated status.`);

// Optionally refresh the scan data to show the change
// Note: The scan data itself won't change, but the user knows it's been updated

} catch (error) {
console.error('Error marking port as expected:', error);
showAlert('danger', `Failed to mark port as expected: ${error.message}`);
}
}

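The "Mark Expected" flow above is a read-modify-write against two site endpoints. A Python sketch of the same flow; both routes appear in the JavaScript, while the base URL and auth handling are assumptions:

```python
# Sketch: add one port to an IP's expected TCP or UDP list, like markPortExpected().
import requests

BASE_URL = "http://localhost:5000"  # assumption

def mark_port_expected(site_id: int, ip_id: int, port: int, protocol: str) -> None:
    """Read the IP's current expected ports, add one, and write them back."""
    resp = requests.get(f"{BASE_URL}/api/sites/{site_id}/ips",
                        params={"per_page": 200}, timeout=30)
    resp.raise_for_status()
    ip_data = next((ip for ip in resp.json()["ips"] if ip["id"] == ip_id), None)
    if ip_data is None:
        raise ValueError("IP not found in site")

    tcp = list(ip_data.get("expected_tcp_ports") or [])
    udp = list(ip_data.get("expected_udp_ports") or [])
    (tcp if protocol.lower() == "tcp" else udp).append(port)

    update = requests.put(
        f"{BASE_URL}/api/sites/{site_id}/ips/{ip_id}",
        json={"expected_tcp_ports": sorted(set(tcp)),
              "expected_udp_ports": sorted(set(udp))},
        timeout=30,
    )
    update.raise_for_status()
```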
// Find previous scan and show compare button
let previousScanId = null;
let currentConfigFile = null;
let currentConfigId = null;
async function findPreviousScan() {
try {
// Get current scan details first to know which config it used
const currentScanResponse = await fetch(`/api/scans/${scanId}`);
const currentScanData = await currentScanResponse.json();
currentConfigFile = currentScanData.config_file;
currentConfigId = currentScanData.config_id;

// Get list of completed scans
const response = await fetch('/api/scans?per_page=100&status=completed');
@@ -474,12 +857,12 @@
const currentScanIndex = data.scans.findIndex(s => s.id === scanId);

if (currentScanIndex !== -1) {
// Look for the most recent previous scan with the SAME config file
// Look for the most recent previous scan with the SAME config
for (let i = currentScanIndex + 1; i < data.scans.length; i++) {
const previousScan = data.scans[i];

// Check if this scan uses the same config
if (previousScan.config_file === currentConfigFile) {
if (previousScan.config_id === currentConfigId) {
previousScanId = previousScan.id;

// Show the compare button
@@ -601,5 +984,97 @@
console.error('Error loading historical chart:', error);
}
}

// Show certificate details modal
function showCertificateModal(cert) {
// Populate modal fields
document.getElementById('cert-subject').textContent = cert.subject || '-';
document.getElementById('cert-issuer').textContent = cert.issuer || '-';
document.getElementById('cert-serial').textContent = cert.serial_number || '-';

// Format dates
document.getElementById('cert-valid-from').textContent = cert.not_valid_before
? new Date(cert.not_valid_before).toLocaleString()
: '-';
document.getElementById('cert-valid-until').textContent = cert.not_valid_after
? new Date(cert.not_valid_after).toLocaleString()
: '-';

// Days until expiry with color coding
if (cert.days_until_expiry !== null && cert.days_until_expiry !== undefined) {
let badgeClass = 'badge-success';
if (cert.days_until_expiry < 0) {
badgeClass = 'badge-danger';
} else if (cert.days_until_expiry < 30) {
badgeClass = 'badge-warning';
}
document.getElementById('cert-days-expiry').innerHTML =
`<span class="badge ${badgeClass}">${cert.days_until_expiry} days</span>`;
} else {
document.getElementById('cert-days-expiry').textContent = '-';
}

// Self-signed indicator
document.getElementById('cert-self-signed').innerHTML = cert.is_self_signed
? '<span class="badge badge-warning">Yes</span>'
: '<span class="badge badge-success">No</span>';

// SANs
if (cert.sans && cert.sans.length > 0) {
document.getElementById('cert-sans').innerHTML = cert.sans
.map(san => `<span class="badge bg-secondary me-1 mb-1">${san}</span>`)
.join('');
} else {
document.getElementById('cert-sans').textContent = 'None';
}

// TLS versions
if (cert.tls_versions && cert.tls_versions.length > 0) {
let tlsHtml = '<div class="table-responsive"><table class="table table-sm mb-0">';
tlsHtml += '<thead><tr><th>Version</th><th>Status</th><th>Cipher Suites</th></tr></thead><tbody>';

cert.tls_versions.forEach(tls => {
const statusBadge = tls.supported
? '<span class="badge badge-success">Supported</span>'
: '<span class="badge badge-danger">Not Supported</span>';

let ciphers = '-';
if (tls.cipher_suites && tls.cipher_suites.length > 0) {
ciphers = `<small class="text-muted">${tls.cipher_suites.length} cipher(s)</small>
<button class="btn btn-sm btn-link p-0 ms-1" onclick="toggleCiphers(this, '${tls.tls_version}')" data-ciphers='${JSON.stringify(tls.cipher_suites).replace(/'/g, "&#39;")}'>
<i class="bi bi-chevron-down"></i>
</button>
<div class="cipher-list" style="display:none; font-size: 0.75rem; max-height: 100px; overflow-y: auto;"></div>`;
}

tlsHtml += `<tr class="scan-row"><td>${tls.tls_version}</td><td>${statusBadge}</td><td>${ciphers}</td></tr>`;
});

tlsHtml += '</tbody></table></div>';
document.getElementById('cert-tls-versions').innerHTML = tlsHtml;
} else {
document.getElementById('cert-tls-versions').textContent = 'No TLS information available';
}

// Show modal
const modal = new bootstrap.Modal(document.getElementById('certificateModal'));
modal.show();
}

// Toggle cipher suites display
function toggleCiphers(btn, version) {
const cipherList = btn.nextElementSibling;
const icon = btn.querySelector('i');

if (cipherList.style.display === 'none') {
const ciphers = JSON.parse(btn.dataset.ciphers);
cipherList.innerHTML = ciphers.map(c => `<div class="mono">${c}</div>`).join('');
cipherList.style.display = 'block';
icon.className = 'bi bi-chevron-up';
} else {
cipherList.style.display = 'none';
icon.className = 'bi bi-chevron-down';
}
}
</script>
{% endblock %}

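The stop button above just POSTs to the scan's stop route; the same call works from a script. A minimal sketch, where the `/api/scans/<id>/stop` route and the `message` error field come from the template and the base URL is an assumption:

```python
# Sketch: send a stop signal to a running scan, like the stopScan() handler.
import requests

BASE_URL = "http://localhost:5000"  # assumption

def stop_scan(scan_id: int) -> None:
    resp = requests.post(f"{BASE_URL}/api/scans/{scan_id}/stop", timeout=30)
    if not resp.ok:
        # The API appears to return {"message": ...} on errors.
        raise RuntimeError(resp.json().get("message", f"HTTP {resp.status_code}"))
    print(f"Stop signal sent to scan {scan_id}")
```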
@@ -5,7 +5,7 @@
{% block content %}
<div class="row mt-4">
<div class="col-12 d-flex justify-content-between align-items-center mb-4">
<h1 style="color: #60a5fa;">All Scans</h1>
<h1>All Scans</h1>
<button class="btn btn-primary" onclick="showTriggerScanModal()">
<span id="trigger-btn-text">Trigger New Scan</span>
<span id="trigger-btn-spinner" class="spinner-border spinner-border-sm ms-2" style="display: none;"></span>
@@ -26,6 +26,7 @@
<option value="running">Running</option>
<option value="completed">Completed</option>
<option value="failed">Failed</option>
<option value="cancelled">Cancelled</option>
</select>
</div>
<div class="col-md-4">
@@ -54,7 +55,7 @@
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0" style="color: #60a5fa;">Scan History</h5>
<h5 class="mb-0">Scan History</h5>
</div>
<div class="card-body">
<div id="scans-loading" class="text-center py-5">
@@ -105,42 +106,36 @@
<!-- Trigger Scan Modal -->
<div class="modal fade" id="triggerScanModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content" style="background-color: #1e293b; border: 1px solid #334155;">
<div class="modal-header" style="border-bottom: 1px solid #334155;">
<h5 class="modal-title" style="color: #60a5fa;">Trigger New Scan</h5>
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Trigger New Scan</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<form id="trigger-scan-form">
<div class="mb-3">
<label for="config-file" class="form-label">Config File</label>
<select class="form-select" id="config-file" name="config_file" required {% if not config_files %}disabled{% endif %}>
<option value="">Select a config file...</option>
{% for config in config_files %}
<option value="{{ config }}">{{ config }}</option>
{% endfor %}
<label for="config-select" class="form-label">Scan Configuration</label>
<select class="form-select" id="config-select" name="config_id" required>
<option value="">Loading configurations...</option>
</select>
{% if config_files %}
<div class="form-text text-muted">
Select a scan configuration file
<div class="form-text text-muted" id="config-help-text">
Select a scan configuration
</div>
{% else %}
<div class="alert alert-warning mt-2 mb-0" role="alert">
<div id="no-configs-warning" class="alert alert-warning mt-2 mb-0" role="alert" style="display: none;">
<i class="bi bi-exclamation-triangle"></i>
<strong>No configurations available</strong>
<p class="mb-2 mt-2">You need to create a configuration file before you can trigger a scan.</p>
<a href="{{ url_for('main.upload_config') }}" class="btn btn-sm btn-primary">
<p class="mb-2 mt-2">You need to create a configuration before you can trigger a scan.</p>
<a href="{{ url_for('main.configs') }}" class="btn btn-sm btn-primary">
<i class="bi bi-plus-circle"></i> Create Configuration
</a>
</div>
{% endif %}
</div>
<div id="trigger-error" class="alert alert-danger" style="display: none;"></div>
</form>
</div>
<div class="modal-footer" style="border-top: 1px solid #334155;">
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="button" class="btn btn-primary" onclick="triggerScan()" {% if not config_files %}disabled{% endif %}>
<button type="button" class="btn btn-primary" id="trigger-scan-btn" onclick="triggerScan()">
<span id="modal-trigger-text">Trigger Scan</span>
<span id="modal-trigger-spinner" class="spinner-border spinner-border-sm ms-2" style="display: none;"></span>
</button>
@@ -157,6 +152,25 @@
let statusFilter = '';
let totalCount = 0;

// Show alert notification
function showAlert(type, message) {
const container = document.getElementById('notification-container');
const notification = document.createElement('div');
notification.className = `alert alert-${type} alert-dismissible fade show mb-2`;

notification.innerHTML = `
${message}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
`;

container.appendChild(notification);

// Auto-dismiss after 5 seconds
setTimeout(() => {
notification.remove();
}, 5000);
}

// Load initial data when page loads
document.addEventListener('DOMContentLoaded', function() {
loadScans();
@@ -235,20 +249,27 @@
statusBadge = '<span class="badge badge-info">Running</span>';
} else if (scan.status === 'failed') {
statusBadge = '<span class="badge badge-danger">Failed</span>';
} else if (scan.status === 'cancelled') {
statusBadge = '<span class="badge badge-warning">Cancelled</span>';
} else {
statusBadge = `<span class="badge badge-info">${scan.status}</span>`;
}

// Action buttons
let actionButtons = `<a href="/scans/${scan.id}" class="btn btn-sm btn-secondary">View</a>`;
if (scan.status === 'running') {
actionButtons += `<button class="btn btn-sm btn-warning ms-1" onclick="stopScan(${scan.id})">Stop</button>`;
} else {
actionButtons += `<button class="btn btn-sm btn-danger ms-1" onclick="deleteScan(${scan.id})">Delete</button>`;
}

row.innerHTML = `
<td class="mono">${scan.id}</td>
<td>${scan.title || 'Untitled Scan'}</td>
<td class="text-muted">${timestamp}</td>
<td class="mono">${duration}</td>
<td>${statusBadge}</td>
<td>
<a href="/scans/${scan.id}" class="btn btn-sm btn-secondary">View</a>
${scan.status !== 'running' ? `<button class="btn btn-sm btn-danger ms-1" onclick="deleteScan(${scan.id})">Delete</button>` : ''}
</td>
<td>${actionButtons}</td>
`;

tbody.appendChild(row);
@@ -359,23 +380,75 @@
});
}

// Load available configs
async function loadConfigs() {
const selectEl = document.getElementById('config-select');
const helpTextEl = document.getElementById('config-help-text');
const noConfigsWarning = document.getElementById('no-configs-warning');
const triggerBtn = document.getElementById('trigger-scan-btn');

try {
const response = await fetch('/api/configs');
if (!response.ok) {
throw new Error('Failed to load configurations');
}

const data = await response.json();
const configs = data.configs || [];

// Clear existing options
selectEl.innerHTML = '';

if (configs.length === 0) {
selectEl.innerHTML = '<option value="">No configurations available</option>';
selectEl.disabled = true;
triggerBtn.disabled = true;
helpTextEl.style.display = 'none';
noConfigsWarning.style.display = 'block';
} else {
selectEl.innerHTML = '<option value="">Select a configuration...</option>';
configs.forEach(config => {
const option = document.createElement('option');
option.value = config.id;
const siteText = config.site_count === 1 ? 'site' : 'sites';
option.textContent = `${config.title} (${config.site_count} ${siteText})`;
selectEl.appendChild(option);
});
selectEl.disabled = false;
triggerBtn.disabled = false;
helpTextEl.style.display = 'block';
noConfigsWarning.style.display = 'none';
}
} catch (error) {
console.error('Error loading configs:', error);
selectEl.innerHTML = '<option value="">Error loading configurations</option>';
selectEl.disabled = true;
triggerBtn.disabled = true;
helpTextEl.style.display = 'none';
}
}

// Show trigger scan modal
function showTriggerScanModal() {
const modal = new bootstrap.Modal(document.getElementById('triggerScanModal'));
document.getElementById('trigger-error').style.display = 'none';
document.getElementById('trigger-scan-form').reset();

// Load configs when modal is shown
loadConfigs();

modal.show();
}

// Trigger scan
async function triggerScan() {
const configFile = document.getElementById('config-file').value;
const configId = document.getElementById('config-select').value;
const errorEl = document.getElementById('trigger-error');
const btnText = document.getElementById('modal-trigger-text');
const btnSpinner = document.getElementById('modal-trigger-spinner');

if (!configFile) {
errorEl.textContent = 'Please enter a config file path.';
if (!configId) {
errorEl.textContent = 'Please select a configuration.';
errorEl.style.display = 'block';
return;
}
@@ -392,13 +465,13 @@
'Content-Type': 'application/json',
},
body: JSON.stringify({
config_file: configFile
config_id: parseInt(configId)
})
});

if (!response.ok) {
const data = await response.json();
throw new Error(data.error || 'Failed to trigger scan');
throw new Error(data.message || data.error || 'Failed to trigger scan');
}

const data = await response.json();
@@ -410,15 +483,7 @@
bootstrap.Modal.getInstance(document.getElementById('triggerScanModal')).hide();

// Show success message
const alertDiv = document.createElement('div');
alertDiv.className = 'alert alert-success alert-dismissible fade show mt-3';
alertDiv.innerHTML = `
Scan triggered successfully! (ID: ${data.scan_id})
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
`;
// Insert at the beginning of container-fluid
const container = document.querySelector('.container-fluid');
container.insertBefore(alertDiv, container.firstChild);
showAlert('success', `Scan triggered successfully! (ID: ${data.scan_id})`);

// Refresh scans
loadScans();
@@ -432,6 +497,33 @@
}
}

// Stop scan
async function stopScan(scanId) {
if (!confirm(`Are you sure you want to stop scan ${scanId}?`)) {
return;
}

try {
const response = await fetch(`/api/scans/${scanId}/stop`, {
method: 'POST'
});

if (!response.ok) {
const data = await response.json();
throw new Error(data.message || 'Failed to stop scan');
}

// Show success message
showAlert('success', `Stop signal sent to scan ${scanId}.`);

// Refresh scans after a short delay
setTimeout(() => loadScans(), 1000);
} catch (error) {
console.error('Error stopping scan:', error);
showAlert('danger', `Failed to stop scan: ${error.message}`);
}
}

// Delete scan
async function deleteScan(scanId) {
if (!confirm(`Are you sure you want to delete scan ${scanId}?`)) {
@@ -444,44 +536,20 @@
});

if (!response.ok) {
throw new Error('Failed to delete scan');
const data = await response.json();
throw new Error(data.message || 'Failed to delete scan');
}

// Show success message
const alertDiv = document.createElement('div');
alertDiv.className = 'alert alert-success alert-dismissible fade show mt-3';
alertDiv.innerHTML = `
Scan ${scanId} deleted successfully.
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
`;
document.querySelector('.container-fluid').insertBefore(alertDiv, document.querySelector('.row'));
showAlert('success', `Scan ${scanId} deleted successfully.`);

// Refresh scans
loadScans();
} catch (error) {
console.error('Error deleting scan:', error);
alert('Failed to delete scan. Please try again.');
showAlert('danger', `Failed to delete scan: ${error.message}`);
}
}

// Custom pagination styles
const style = document.createElement('style');
style.textContent = `
.pagination {
--bs-pagination-bg: #1e293b;
--bs-pagination-border-color: #334155;
--bs-pagination-hover-bg: #334155;
--bs-pagination-hover-border-color: #475569;
--bs-pagination-focus-bg: #334155;
--bs-pagination-active-bg: #3b82f6;
--bs-pagination-active-border-color: #3b82f6;
--bs-pagination-disabled-bg: #0f172a;
--bs-pagination-disabled-border-color: #334155;
--bs-pagination-color: #e2e8f0;
--bs-pagination-hover-color: #e2e8f0;
--bs-pagination-disabled-color: #64748b;
}
`;
document.head.appendChild(style);
</script>
{% endblock %}

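Scripting the trigger modal's flow is two calls: list configurations, then submit a `config_id`. `GET /api/configs` and the `config_id` body field are taken from the code above; the trigger route itself falls outside these hunks, so the `POST /api/scans` path below is hypothetical (check docs/API_REFERENCE.md for the real one), as are the base URL and auth handling:

```python
# Sketch: trigger a scan for the first available configuration.
import requests

BASE_URL = "http://localhost:5000"  # assumption

def trigger_first_config() -> int:
    configs = requests.get(f"{BASE_URL}/api/configs", timeout=30).json().get("configs", [])
    if not configs:
        raise RuntimeError("No configurations available")
    resp = requests.post(
        f"{BASE_URL}/api/scans",  # hypothetical trigger endpoint
        json={"config_id": int(configs[0]["id"])},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["scan_id"]
```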
@@ -32,13 +32,13 @@
<small class="form-text text-muted">A descriptive name for this schedule</small>
</div>

<!-- Config File -->
<!-- Config -->
<div class="mb-3">
<label for="config-file" class="form-label">Configuration File <span class="text-danger">*</span></label>
<select class="form-select" id="config-file" name="config_file" required>
<option value="">Select a configuration file...</option>
{% for config in config_files %}
<option value="{{ config }}">{{ config }}</option>
<label for="config-id" class="form-label">Configuration <span class="text-danger">*</span></label>
<select class="form-select" id="config-id" name="config_id" required>
<option value="">Select a configuration...</option>
{% for config in configs %}
<option value="{{ config.id }}">{{ config.title }}</option>
{% endfor %}
</select>
<small class="form-text text-muted">The scan configuration to use for this schedule</small>
@@ -369,13 +369,13 @@ document.getElementById('create-schedule-form').addEventListener('submit', async
// Get form data
const formData = {
name: document.getElementById('schedule-name').value.trim(),
config_file: document.getElementById('config-file').value,
config_id: parseInt(document.getElementById('config-id').value),
cron_expression: document.getElementById('cron-expression').value.trim(),
enabled: document.getElementById('schedule-enabled').checked
};

// Validate
if (!formData.name || !formData.config_file || !formData.cron_expression) {
if (!formData.name || !formData.config_id || !formData.cron_expression) {
showNotification('Please fill in all required fields', 'warning');
return;
}
@@ -419,20 +419,16 @@ document.getElementById('create-schedule-form').addEventListener('submit', async

// Show notification
function showNotification(message, type = 'info') {
const container = document.getElementById('notification-container');
const notification = document.createElement('div');
notification.className = `alert alert-${type} alert-dismissible fade show`;
notification.style.position = 'fixed';
notification.style.top = '20px';
notification.style.right = '20px';
notification.style.zIndex = '9999';
notification.style.minWidth = '300px';
notification.className = `alert alert-${type} alert-dismissible fade show mb-2`;

notification.innerHTML = `
${message}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
`;

document.body.appendChild(notification);
container.appendChild(notification);

setTimeout(() => {
notification.remove();

@@ -298,7 +298,11 @@ async function loadSchedule() {
function populateForm(schedule) {
document.getElementById('schedule-id').value = schedule.id;
document.getElementById('schedule-name').value = schedule.name;
document.getElementById('config-file').value = schedule.config_file;
// Display config name and ID in the readonly config-file field
const configDisplay = schedule.config_name
? `${schedule.config_name} (ID: ${schedule.config_id})`
: `Config ID: ${schedule.config_id}`;
document.getElementById('config-file').value = configDisplay;
document.getElementById('cron-expression').value = schedule.cron_expression;
document.getElementById('schedule-enabled').checked = schedule.enabled;

@@ -554,20 +558,16 @@ async function deleteSchedule() {

// Show notification
function showNotification(message, type = 'info') {
const container = document.getElementById('notification-container');
const notification = document.createElement('div');
notification.className = `alert alert-${type} alert-dismissible fade show`;
notification.style.position = 'fixed';
notification.style.top = '20px';
notification.style.right = '20px';
notification.style.zIndex = '9999';
notification.style.minWidth = '300px';
notification.className = `alert alert-${type} alert-dismissible fade show mb-2`;

notification.innerHTML = `
${message}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
`;

document.body.appendChild(notification);
container.appendChild(notification);

setTimeout(() => {
notification.remove();

@@ -6,7 +6,7 @@
<div class="row mt-4">
<div class="col-12">
<div class="d-flex justify-content-between align-items-center mb-4">
<h1 style="color: #60a5fa;">Scheduled Scans</h1>
<h1>Scheduled Scans</h1>
<a href="{{ url_for('main.create_schedule') }}" class="btn btn-primary">
<i class="bi bi-plus-circle"></i> New Schedule
</a>
@@ -47,7 +47,7 @@
<div class="col-12">
<div class="card">
<div class="card-header">
<h5 class="mb-0" style="color: #60a5fa;">All Schedules</h5>
<h5 class="mb-0">All Schedules</h5>
</div>
<div class="card-body">
<div id="schedules-loading" class="text-center py-5">
@@ -198,7 +198,7 @@ function renderSchedules() {
<td>
<strong>${escapeHtml(schedule.name)}</strong>
<br>
<small class="text-muted">${escapeHtml(schedule.config_file)}</small>
<small class="text-muted">Config ID: ${schedule.config_id || 'N/A'}</small>
</td>
<td class="mono"><code>${escapeHtml(schedule.cron_expression)}</code></td>
<td>${formatRelativeTime(schedule.next_run)}</td>
@@ -352,21 +352,16 @@ async function deleteSchedule(scheduleId) {

// Show notification
function showNotification(message, type = 'info') {
// Create notification element
const container = document.getElementById('notification-container');
const notification = document.createElement('div');
notification.className = `alert alert-${type} alert-dismissible fade show`;
notification.style.position = 'fixed';
notification.style.top = '20px';
notification.style.right = '20px';
notification.style.zIndex = '9999';
notification.style.minWidth = '300px';
notification.className = `alert alert-${type} alert-dismissible fade show mb-2`;

notification.innerHTML = `
${message}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
`;

document.body.appendChild(notification);
container.appendChild(notification);

// Auto-remove after 5 seconds
setTimeout(() => {

@@ -1,95 +1,60 @@
<!DOCTYPE html>
<html lang="en" data-bs-theme="dark">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Setup - SneakyScanner</title>
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
<style>
body {
height: 100vh;
display: flex;
align-items: center;
justify-content: center;
background: linear-gradient(135deg, #1a1a2e 0%, #16213e 100%);
}
.setup-container {
width: 100%;
max-width: 500px;
padding: 2rem;
}
.card {
border: none;
box-shadow: 0 8px 32px rgba(0, 0, 0, 0.3);
}
.brand-title {
color: #00d9ff;
font-weight: 600;
margin-bottom: 0.5rem;
}
</style>
</head>
<body>
<div class="setup-container">
<div class="card">
<div class="card-body p-5">
<div class="text-center mb-4">
<h1 class="brand-title">SneakyScanner</h1>
<p class="text-muted">Initial Setup</p>
</div>
{% extends "base.html" %}

<div class="alert alert-info mb-4">
<strong>Welcome!</strong> Please set an application password to secure your scanner.
</div>
{% block title %}Setup - SneakyScanner{% endblock %}

{% with messages = get_flashed_messages(with_categories=true) %}
{% if messages %}
{% for category, message in messages %}
<div class="alert alert-{{ 'danger' if category == 'error' else category }} alert-dismissible fade show" role="alert">
{{ message }}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
</div>
{% endfor %}
{% endif %}
{% endwith %}
{% set hide_nav = true %}

<form method="post" action="{{ url_for('auth.setup') }}">
<div class="mb-3">
<label for="password" class="form-label">Password</label>
<input type="password"
class="form-control"
id="password"
name="password"
required
minlength="8"
autofocus
placeholder="Enter password (min 8 characters)">
<div class="form-text">Password must be at least 8 characters long.</div>
</div>

<div class="mb-4">
<label for="confirm_password" class="form-label">Confirm Password</label>
<input type="password"
class="form-control"
id="confirm_password"
name="confirm_password"
required
minlength="8"
placeholder="Confirm your password">
</div>

<button type="submit" class="btn btn-primary btn-lg w-100">
Set Password
</button>
</form>
</div>
</div>

<div class="text-center mt-3">
<small class="text-muted">SneakyScanner v1.0 - Phase 2</small>
</div>
{% block content %}
<div class="login-card">
<div class="text-center mb-4">
<h1 class="brand-title">SneakyScanner</h1>
<p class="brand-subtitle">Initial Setup</p>
</div>

<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
</body>
</html>
<div class="alert alert-info mb-4">
<i class="bi bi-info-circle me-1"></i>
<strong>Welcome!</strong> Please set an application password to secure your scanner.
</div>

{% with messages = get_flashed_messages(with_categories=true) %}
{% if messages %}
{% for category, message in messages %}
<div class="alert alert-{{ 'danger' if category == 'error' else category }} alert-dismissible fade show" role="alert">
{{ message }}
<button type="button" class="btn-close" data-bs-dismiss="alert"></button>
</div>
{% endfor %}
{% endif %}
{% endwith %}

<form method="post" action="{{ url_for('auth.setup') }}">
<div class="mb-3">
<label for="password" class="form-label">Password</label>
<input type="password"
class="form-control form-control-lg"
id="password"
name="password"
required
minlength="8"
autofocus
placeholder="Enter password (min 8 characters)">
<div class="form-text">Password must be at least 8 characters long.</div>
</div>

<div class="mb-4">
<label for="confirm_password" class="form-label">Confirm Password</label>
<input type="password"
class="form-control form-control-lg"
id="confirm_password"
name="confirm_password"
required
minlength="8"
placeholder="Confirm your password">
</div>

<button type="submit" class="btn btn-primary btn-lg w-100">
Set Password
</button>
</form>
</div>
{% endblock %}

1187
app/web/templates/sites.html
Normal file
File diff suppressed because it is too large
9
app/web/templates/webhook_presets/custom_json.j2
Normal file
@@ -0,0 +1,9 @@
{
  "title": "{{ scan.title }} - {{ alert.type|title|replace('_', ' ') }}",
  "message": "{{ alert.message }}{% if alert.ip_address %} on {{ alert.ip_address }}{% endif %}{% if alert.port %}:{{ alert.port }}{% endif %}",
  "priority": {% if alert.severity == 'critical' %}5{% elif alert.severity == 'warning' %}3{% else %}1{% endif %},
  "severity": "{{ alert.severity }}",
  "scan_id": {{ scan.id }},
  "alert_id": {{ alert.id }},
  "timestamp": "{{ timestamp.isoformat() }}"
}
25
app/web/templates/webhook_presets/default_json.j2
Normal file
@@ -0,0 +1,25 @@
{
  "event": "alert.created",
  "alert": {
    "id": {{ alert.id }},
    "type": "{{ alert.type }}",
    "severity": "{{ alert.severity }}",
    "message": "{{ alert.message }}",
    {% if alert.ip_address %}"ip_address": "{{ alert.ip_address }}",{% endif %}
    {% if alert.port %}"port": {{ alert.port }},{% endif %}
    "acknowledged": {{ alert.acknowledged|lower }},
    "created_at": "{{ alert.created_at.isoformat() }}"
  },
  "scan": {
    "id": {{ scan.id }},
    "title": "{{ scan.title }}",
    "timestamp": "{{ scan.timestamp.isoformat() }}",
    "status": "{{ scan.status }}"
  },
  "rule": {
    "id": {{ rule.id }},
    "name": "{{ rule.name }}",
    "type": "{{ rule.type }}",
    "threshold": {{ rule.threshold if rule.threshold else 'null' }}
  }
}
41
app/web/templates/webhook_presets/discord.j2
Normal file
@@ -0,0 +1,41 @@
{
  "username": "SneakyScanner",
  "embeds": [
    {
      "title": "{{ alert.type|title|replace('_', ' ') }} Alert",
      "description": "{{ alert.message }}",
      "color": {% if alert.severity == 'critical' %}15158332{% elif alert.severity == 'warning' %}16776960{% else %}3447003{% endif %},
      "fields": [
        {
          "name": "Severity",
          "value": "{{ alert.severity|upper }}",
          "inline": true
        },
        {
          "name": "Scan",
          "value": "{{ scan.title }}",
          "inline": true
        },
        {
          "name": "Rule",
          "value": "{{ rule.name }}",
          "inline": false
        }{% if alert.ip_address %},
        {
          "name": "IP Address",
          "value": "{{ alert.ip_address }}",
          "inline": true
        }{% endif %}{% if alert.port %},
        {
          "name": "Port",
          "value": "{{ alert.port }}",
          "inline": true
        }{% endif %}
      ],
      "footer": {
        "text": "Alert ID: {{ alert.id }} | Scan ID: {{ scan.id }}"
      },
      "timestamp": "{{ timestamp.isoformat() }}"
    }
  ]
}
13
app/web/templates/webhook_presets/gotify.j2
Normal file
@@ -0,0 +1,13 @@
{
  "title": "{{ scan.title }}",
  "message": "**{{ alert.severity|upper }}**: {{ alert.message }}\n\n**Scan:** {{ scan.title }}\n**Status:** {{ scan.status }}\n**Rule:** {{ rule.name }}{% if alert.ip_address %}\n**IP:** {{ alert.ip_address }}{% endif %}{% if alert.port %}\n**Port:** {{ alert.port }}{% endif %}",
  "priority": {% if alert.severity == 'critical' %}8{% elif alert.severity == 'warning' %}5{% else %}2{% endif %},
  "extras": {
    "client::display": {
      "contentType": "text/markdown"
    },
    "alert_id": {{ alert.id }},
    "scan_id": {{ scan.id }},
    "alert_type": "{{ alert.type }}"
  }
}
10
app/web/templates/webhook_presets/ntfy.j2
Normal file
@@ -0,0 +1,10 @@
{{ alert.message }}

Scan: {{ scan.title }}
Rule: {{ rule.name }}
Severity: {{ alert.severity|upper }}{% if alert.ip_address %}
IP: {{ alert.ip_address }}{% endif %}{% if alert.port %}
Port: {{ alert.port }}{% endif %}

Scan Status: {{ scan.status }}
Alert ID: {{ alert.id }}
27
app/web/templates/webhook_presets/plain_text.j2
Normal file
@@ -0,0 +1,27 @@
SNEAKYSCANNER ALERT - {{ alert.severity|upper }}

Alert: {{ alert.message }}
Type: {{ alert.type|title|replace('_', ' ') }}
Severity: {{ alert.severity|upper }}

Scan Information:
Title: {{ scan.title }}
Status: {{ scan.status }}
Duration: {{ scan.duration }}s
Triggered By: {{ scan.triggered_by }}

Rule Information:
Name: {{ rule.name }}
Type: {{ rule.type }}
{% if rule.threshold %} Threshold: {{ rule.threshold }}
{% endif %}
{% if alert.ip_address %}IP Address: {{ alert.ip_address }}
{% endif %}{% if alert.port %}Port: {{ alert.port }}
{% endif %}
Alert ID: {{ alert.id }}
Scan ID: {{ scan.id }}
Timestamp: {{ timestamp.strftime('%Y-%m-%d %H:%M:%S UTC') }}

---
Generated by {{ app.name }} v{{ app.version }}
{{ app.url }}
65
app/web/templates/webhook_presets/presets.json
Normal file
@@ -0,0 +1,65 @@
|
||||
[
|
||||
{
|
||||
"id": "default_json",
|
||||
"name": "Default JSON (Current Format)",
|
||||
"description": "Standard webhook payload format matching the current implementation",
|
||||
"format": "json",
|
||||
"content_type": "application/json",
|
||||
"file": "default_json.j2",
|
||||
"category": "general"
|
||||
},
|
||||
{
|
||||
"id": "custom_json",
|
||||
"name": "Custom JSON",
|
||||
"description": "Flexible custom JSON format with configurable title, message, and priority fields",
|
||||
"format": "json",
|
||||
"content_type": "application/json",
|
||||
"file": "custom_json.j2",
|
||||
"category": "general"
|
||||
},
|
||||
{
|
||||
"id": "gotify",
|
||||
"name": "Gotify",
|
||||
"description": "Optimized for Gotify push notification server with markdown support",
|
||||
"format": "json",
|
||||
"content_type": "application/json",
|
||||
"file": "gotify.j2",
|
||||
"category": "service"
|
||||
},
|
||||
{
|
||||
"id": "ntfy",
|
||||
"name": "Ntfy",
|
||||
"description": "Simple text format for Ntfy pub-sub notification service",
|
||||
"format": "text",
|
||||
"content_type": "text/plain",
|
||||
"file": "ntfy.j2",
|
||||
"category": "service"
|
||||
},
|
||||
{
|
||||
"id": "slack",
|
||||
"name": "Slack",
|
||||
"description": "Rich Block Kit format for Slack webhooks with visual formatting",
|
||||
"format": "json",
|
||||
"content_type": "application/json",
|
||||
"file": "slack.j2",
|
||||
"category": "service"
|
||||
},
|
||||
{
|
||||
"id": "discord",
|
||||
"name": "Discord",
|
||||
"description": "Embedded message format for Discord webhooks with color-coded severity",
|
||||
"format": "json",
|
||||
"content_type": "application/json",
|
||||
"file": "discord.j2",
|
||||
"category": "service"
|
||||
},
|
||||
{
|
||||
"id": "plain_text",
|
||||
"name": "Plain Text",
|
||||
"description": "Simple plain text format for logging or basic notification services",
|
||||
"format": "text",
|
||||
"content_type": "text/plain",
|
||||
"file": "plain_text.j2",
|
||||
"category": "general"
|
||||
}
|
||||
]
|
||||
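`presets.json` is just an index; each entry's `file` field names a sibling `.j2` template. A hedged sketch of a loader follows — the helper name and return shape are assumptions, chosen to match what the form page's `loadPresets()` JavaScript expects (`template`, `format`, `content_type` per preset).

```python
import json
from pathlib import Path

PRESET_DIR = Path("app/web/templates/webhook_presets")

def load_presets() -> list:
    """Hypothetical helper: read presets.json and inline each template body."""
    presets = json.loads((PRESET_DIR / "presets.json").read_text())
    for preset in presets:
        # The "file" field names a .j2 template that lives next to presets.json
        preset["template"] = (PRESET_DIR / preset["file"]).read_text()
    return presets
```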
60
app/web/templates/webhook_presets/slack.j2
Normal file
@@ -0,0 +1,60 @@
{
  "text": "{{ alert.severity|upper }}: {{ alert.message }}",
  "blocks": [
    {
      "type": "header",
      "text": {
        "type": "plain_text",
        "text": "🚨 {{ alert.severity|upper }} Alert: {{ alert.type|title|replace('_', ' ') }}"
      }
    },
    {
      "type": "section",
      "fields": [
        {
          "type": "mrkdwn",
          "text": "*Alert:*\n{{ alert.message }}"
        },
        {
          "type": "mrkdwn",
          "text": "*Severity:*\n{{ alert.severity|upper }}"
        }
      ]
    },
    {
      "type": "section",
      "fields": [
        {
          "type": "mrkdwn",
          "text": "*Scan:*\n{{ scan.title }}"
        },
        {
          "type": "mrkdwn",
          "text": "*Rule:*\n{{ rule.name }}"
        }
      ]
    }{% if alert.ip_address or alert.port %},
    {
      "type": "section",
      "fields": [{% if alert.ip_address %}
        {
          "type": "mrkdwn",
          "text": "*IP Address:*\n{{ alert.ip_address }}"
        }{% if alert.port %},{% endif %}{% endif %}{% if alert.port %}
        {
          "type": "mrkdwn",
          "text": "*Port:*\n{{ alert.port }}"
        }{% endif %}
      ]
    }{% endif %},
    {
      "type": "context",
      "elements": [
        {
          "type": "mrkdwn",
          "text": "Scan ID: {{ scan.id }} | Alert ID: {{ alert.id }} | {{ timestamp.strftime('%Y-%m-%d %H:%M:%S UTC') }}"
        }
      ]
    }
  ]
}
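Presets like `slack.j2` and `discord.j2` weave `{% if %}` blocks around JSON commas, which is easy to break silently. One cheap guard is to render and immediately parse; a minimal sketch, not taken from the app's code:

```python
import json
from jinja2 import Template

def assert_valid_json(template_text: str, context: dict) -> dict:
    """Render a JSON-format preset and fail fast if the output is not valid JSON
    (e.g. a trailing comma left behind by an {% if %} branch)."""
    rendered = Template(template_text).render(**context)
    return json.loads(rendered)  # raises json.JSONDecodeError on a bad payload
```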
633
app/web/templates/webhooks/form.html
Normal file
@@ -0,0 +1,633 @@
{% extends "base.html" %}

{% block title %}{{ 'Edit' if mode == 'edit' else 'New' }} Webhook - SneakyScanner{% endblock %}

{% block content %}
<div class="row mt-4">
    <div class="col-12 mb-4">
        <h1 style="color: #60a5fa;">{{ 'Edit' if mode == 'edit' else 'Create' }} Webhook</h1>
        <nav aria-label="breadcrumb">
            <ol class="breadcrumb">
                <li class="breadcrumb-item"><a href="{{ url_for('webhooks.list_webhooks') }}">Webhooks</a></li>
                <li class="breadcrumb-item active">{{ 'Edit' if mode == 'edit' else 'New' }}</li>
            </ol>
        </nav>
    </div>
</div>

<div class="row">
    <div class="col-lg-8">
        <div class="card">
            <div class="card-body">
                <form id="webhook-form">
                    <!-- Basic Information -->
                    <h5 class="card-title mb-3">Basic Information</h5>

                    <div class="mb-3">
                        <label for="name" class="form-label">Webhook Name <span class="text-danger">*</span></label>
                        <input type="text" class="form-control" id="name" name="name" required
                               placeholder="e.g., Slack Notifications">
                        <div class="form-text">A descriptive name for this webhook</div>
                    </div>

                    <div class="mb-3">
                        <label for="url" class="form-label">Webhook URL <span class="text-danger">*</span></label>
                        <input type="url" class="form-control" id="url" name="url" required
                               placeholder="https://hooks.example.com/webhook">
                        <div class="form-text">The endpoint where alerts will be sent</div>
                    </div>

                    <div class="mb-3">
                        <div class="form-check form-switch">
                            <input class="form-check-input" type="checkbox" id="enabled" name="enabled" checked>
                            <label class="form-check-label" for="enabled">Enabled</label>
                        </div>
                        <div class="form-text">Disabled webhooks will not receive notifications</div>
                    </div>

                    <hr class="my-4">

                    <!-- Authentication -->
                    <h5 class="card-title mb-3">Authentication</h5>

                    <div class="mb-3">
                        <label for="auth_type" class="form-label">Authentication Type</label>
                        <select class="form-select" id="auth_type" name="auth_type">
                            <option value="none">None</option>
                            <option value="bearer">Bearer Token</option>
                            <option value="basic">Basic Auth (username:password)</option>
                            <option value="custom">Custom Headers</option>
                        </select>
                    </div>

                    <div class="mb-3" id="auth_token_field" style="display: none;">
                        <label for="auth_token" class="form-label">Authentication Token</label>
                        <input type="password" class="form-control" id="auth_token" name="auth_token"
                               placeholder="Enter token or username:password">
                        <div class="form-text" id="auth_token_help">Will be encrypted when stored</div>
                    </div>

                    <div class="mb-3" id="custom_headers_field" style="display: none;">
                        <label for="custom_headers" class="form-label">Custom Headers (JSON)</label>
                        <textarea class="form-control font-monospace" id="custom_headers" name="custom_headers" rows="4"
                                  placeholder='{"X-API-Key": "your-key", "X-Custom-Header": "value"}'></textarea>
                        <div class="form-text">JSON object with custom HTTP headers</div>
                    </div>

                    <hr class="my-4">

                    <!-- Filters -->
                    <h5 class="card-title mb-3">Alert Filters</h5>

                    <div class="mb-3">
                        <label class="form-label">Alert Types</label>
                        <div class="form-text mb-2">Select which alert types trigger this webhook (leave all unchecked for all types)</div>
                        <div class="form-check">
                            <input class="form-check-input alert-type-check" type="checkbox" value="unexpected_port" id="type_unexpected">
                            <label class="form-check-label" for="type_unexpected">Unexpected Port</label>
                        </div>
                        <div class="form-check">
                            <input class="form-check-input alert-type-check" type="checkbox" value="drift_detection" id="type_drift">
                            <label class="form-check-label" for="type_drift">Drift Detection</label>
                        </div>
                        <div class="form-check">
                            <input class="form-check-input alert-type-check" type="checkbox" value="cert_expiry" id="type_cert">
                            <label class="form-check-label" for="type_cert">Certificate Expiry</label>
                        </div>
                        <div class="form-check">
                            <input class="form-check-input alert-type-check" type="checkbox" value="weak_tls" id="type_tls">
                            <label class="form-check-label" for="type_tls">Weak TLS</label>
                        </div>
                        <div class="form-check">
                            <input class="form-check-input alert-type-check" type="checkbox" value="ping_failed" id="type_ping">
                            <label class="form-check-label" for="type_ping">Ping Failed</label>
                        </div>
                    </div>

                    <div class="mb-3">
                        <label class="form-label">Severity Filter</label>
                        <div class="form-text mb-2">Select which severities trigger this webhook (leave all unchecked for all severities)</div>
                        <div class="form-check">
                            <input class="form-check-input severity-check" type="checkbox" value="critical" id="severity_critical">
                            <label class="form-check-label" for="severity_critical">Critical</label>
                        </div>
                        <div class="form-check">
                            <input class="form-check-input severity-check" type="checkbox" value="warning" id="severity_warning">
                            <label class="form-check-label" for="severity_warning">Warning</label>
                        </div>
                        <div class="form-check">
                            <input class="form-check-input severity-check" type="checkbox" value="info" id="severity_info">
                            <label class="form-check-label" for="severity_info">Info</label>
                        </div>
                    </div>

                    <hr class="my-4">

                    <!-- Webhook Template -->
                    <h5 class="card-title mb-3">Webhook Template</h5>
                    <div class="alert alert-info small">
                        <i class="bi bi-info-circle"></i>
                        Customize the webhook payload using Jinja2 templates. Leave empty to use the default JSON format.
                    </div>

                    <div class="mb-3">
                        <label for="preset_selector" class="form-label">Load Preset Template</label>
                        <select class="form-select" id="preset_selector">
                            <option value="">-- Select a preset --</option>
                        </select>
                        <div class="form-text">Choose from pre-built templates for popular services</div>
                    </div>

                    <div class="mb-3">
                        <label for="template_format" class="form-label">Template Format</label>
                        <select class="form-select" id="template_format" name="template_format">
                            <option value="json">JSON</option>
                            <option value="text">Plain Text</option>
                        </select>
                        <div class="form-text">Output format of the rendered template</div>
                    </div>

                    <div class="mb-3">
                        <label for="template" class="form-label">Template</label>
                        <textarea class="form-control font-monospace" id="template" name="template" rows="12"
                                  placeholder="Leave empty for default format, or enter custom Jinja2 template..."></textarea>
                        <div class="form-text">
                            Available variables: <code>{{ "{{" }} alert.* {{ "}}" }}</code>, <code>{{ "{{" }} scan.* {{ "}}" }}</code>, <code>{{ "{{" }} rule.* {{ "}}" }}</code>
                            <a href="#" data-bs-toggle="modal" data-bs-target="#variablesModal">View all variables</a>
                        </div>
                    </div>

                    <div class="mb-3">
                        <label for="content_type_override" class="form-label">Custom Content-Type (optional)</label>
                        <input type="text" class="form-control font-monospace" id="content_type_override" name="content_type_override"
                               placeholder="e.g., application/json, text/plain, text/markdown">
                        <div class="form-text">Override the default Content-Type header (auto-detected from template format if not set)</div>
                    </div>

                    <div class="mb-3">
                        <button type="button" class="btn btn-outline-secondary btn-sm" id="preview-template-btn">
                            <i class="bi bi-eye"></i> Preview Template
                        </button>
                        <button type="button" class="btn btn-outline-secondary btn-sm ms-2" id="clear-template-btn">
                            <i class="bi bi-x-circle"></i> Clear Template
                        </button>
                    </div>

                    <hr class="my-4">

                    <!-- Advanced Settings -->
                    <h5 class="card-title mb-3">Advanced Settings</h5>

                    <div class="row">
                        <div class="col-md-6 mb-3">
                            <label for="timeout" class="form-label">Timeout (seconds)</label>
                            <input type="number" class="form-control" id="timeout" name="timeout" min="1" max="60" value="10">
                            <div class="form-text">Maximum time to wait for response</div>
                        </div>

                        <div class="col-md-6 mb-3">
                            <label for="retry_count" class="form-label">Retry Count</label>
                            <input type="number" class="form-control" id="retry_count" name="retry_count" min="0" max="5" value="3">
                            <div class="form-text">Number of retry attempts on failure</div>
                        </div>
                    </div>

                    <hr class="my-4">

                    <!-- Submit Buttons -->
                    <div class="d-flex justify-content-between">
                        <a href="{{ url_for('webhooks.list_webhooks') }}" class="btn btn-secondary">Cancel</a>
                        <div>
                            <button type="button" class="btn btn-outline-primary me-2" id="test-btn">
                                <i class="bi bi-send"></i> Test Webhook
                            </button>
                            <button type="submit" class="btn btn-primary">
                                <i class="bi bi-check-circle"></i> Save Webhook
                            </button>
                        </div>
                    </div>
                </form>
            </div>
        </div>
    </div>

    <!-- Help Sidebar -->
    <div class="col-lg-4">
        <div class="card">
            <div class="card-body">
                <h5 class="card-title"><i class="bi bi-info-circle"></i> Help</h5>

                <h6 class="mt-3">Payload Format</h6>
                <p class="small text-muted">Default JSON payload format (can be customized with templates):</p>
                <pre class="small bg-dark text-light p-2 rounded"><code>{
  "event": "alert.created",
  "alert": {
    "id": 123,
    "type": "cert_expiry",
    "severity": "warning",
    "message": "...",
    "ip_address": "192.168.1.10",
    "port": 443
  },
  "scan": {...},
  "rule": {...}
}</code></pre>

                <h6 class="mt-3">Custom Templates</h6>
                <p class="small text-muted">Use Jinja2 templates to customize payloads for services like Slack, Discord, Gotify, or create your own format. Select a preset or write a custom template.</p>

                <h6 class="mt-3">Authentication Types</h6>
                <ul class="small">
                    <li><strong>None:</strong> No authentication</li>
                    <li><strong>Bearer:</strong> Add Authorization header with token</li>
                    <li><strong>Basic:</strong> Use username:password format</li>
                    <li><strong>Custom:</strong> Define custom HTTP headers</li>
                </ul>

                <h6 class="mt-3">Retry Logic</h6>
                <p class="small text-muted">Failed webhooks are retried with exponential backoff (2^attempt seconds, max 60s).</p>
            </div>
        </div>
    </div>
</div>

<!-- Template Variables Modal -->
<div class="modal fade" id="variablesModal" tabindex="-1">
    <div class="modal-dialog modal-lg">
        <div class="modal-content">
            <div class="modal-header">
                <h5 class="modal-title">Available Template Variables</h5>
                <button type="button" class="btn-close" data-bs-dismiss="modal"></button>
            </div>
            <div class="modal-body">
                <h6>Alert Variables</h6>
                <ul class="small font-monospace">
                    <li>{{ "{{" }} alert.id {{ "}}" }} - Alert ID</li>
                    <li>{{ "{{" }} alert.type {{ "}}" }} - Alert type (unexpected_port, cert_expiry, etc.)</li>
                    <li>{{ "{{" }} alert.severity {{ "}}" }} - Severity level (critical, warning, info)</li>
                    <li>{{ "{{" }} alert.message {{ "}}" }} - Human-readable alert message</li>
                    <li>{{ "{{" }} alert.ip_address {{ "}}" }} - IP address (if applicable)</li>
                    <li>{{ "{{" }} alert.port {{ "}}" }} - Port number (if applicable)</li>
                    <li>{{ "{{" }} alert.acknowledged {{ "}}" }} - Boolean: is acknowledged</li>
                    <li>{{ "{{" }} alert.created_at {{ "}}" }} - Alert creation timestamp</li>
                </ul>

                <h6 class="mt-3">Scan Variables</h6>
                <ul class="small font-monospace">
                    <li>{{ "{{" }} scan.id {{ "}}" }} - Scan ID</li>
                    <li>{{ "{{" }} scan.title {{ "}}" }} - Scan title from config</li>
                    <li>{{ "{{" }} scan.timestamp {{ "}}" }} - Scan start time</li>
                    <li>{{ "{{" }} scan.duration {{ "}}" }} - Scan duration in seconds</li>
                    <li>{{ "{{" }} scan.status {{ "}}" }} - Scan status (running, completed, failed)</li>
                    <li>{{ "{{" }} scan.triggered_by {{ "}}" }} - How scan was triggered (manual, scheduled, api)</li>
                </ul>

                <h6 class="mt-3">Rule Variables</h6>
                <ul class="small font-monospace">
                    <li>{{ "{{" }} rule.id {{ "}}" }} - Rule ID</li>
                    <li>{{ "{{" }} rule.name {{ "}}" }} - Rule name</li>
                    <li>{{ "{{" }} rule.type {{ "}}" }} - Rule type</li>
                    <li>{{ "{{" }} rule.threshold {{ "}}" }} - Rule threshold value</li>
                    <li>{{ "{{" }} rule.severity {{ "}}" }} - Rule severity</li>
                </ul>

                <h6 class="mt-3">App Variables</h6>
                <ul class="small font-monospace">
                    <li>{{ "{{" }} app.name {{ "}}" }} - Application name</li>
                    <li>{{ "{{" }} app.version {{ "}}" }} - Application version</li>
                    <li>{{ "{{" }} app.url {{ "}}" }} - Repository URL</li>
                </ul>

                <h6 class="mt-3">Other Variables</h6>
                <ul class="small font-monospace">
                    <li>{{ "{{" }} timestamp {{ "}}" }} - Current UTC timestamp</li>
                </ul>

                <h6 class="mt-3">Jinja2 Features</h6>
                <p class="small">Templates support Jinja2 syntax including:</p>
                <ul class="small">
                    <li>Conditionals: <code>{{ "{%" }} if alert.severity == 'critical' {{ "%}" }}...{{ "{%" }} endif {{ "%}" }}</code></li>
                    <li>Filters: <code>{{ "{{" }} alert.type|upper {{ "}}" }}</code>, <code>{{ "{{" }} alert.created_at.isoformat() {{ "}}" }}</code></li>
                    <li>Default values: <code>{{ "{{" }} alert.port|default('N/A') {{ "}}" }}</code></li>
                </ul>
            </div>
            <div class="modal-footer">
                <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
            </div>
        </div>
    </div>
</div>

<!-- Template Preview Modal -->
<div class="modal fade" id="previewModal" tabindex="-1">
    <div class="modal-dialog modal-lg">
        <div class="modal-content">
            <div class="modal-header">
                <h5 class="modal-title">Template Preview</h5>
                <button type="button" class="btn-close" data-bs-dismiss="modal"></button>
            </div>
            <div class="modal-body">
                <p class="small text-muted">Preview using sample data:</p>
                <pre class="bg-dark text-light p-3 rounded" id="preview-output" style="max-height: 500px; overflow-y: auto;"><code></code></pre>
            </div>
            <div class="modal-footer">
                <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
            </div>
        </div>
    </div>
</div>

{% endblock %}

{% block scripts %}
<script>
const webhookId = {{ webhook.id if webhook else 'null' }};
const mode = '{{ mode }}';

// Load template presets on page load
async function loadPresets() {
    try {
        const response = await fetch('/api/webhooks/template-presets');
        const data = await response.json();

        if (data.status === 'success') {
            const selector = document.getElementById('preset_selector');
            data.presets.forEach(preset => {
                const option = document.createElement('option');
                option.value = JSON.stringify({
                    template: preset.template,
                    format: preset.format,
                    content_type: preset.content_type
                });
                option.textContent = `${preset.name} - ${preset.description}`;
                selector.appendChild(option);
            });
        }
    } catch (error) {
        console.error('Failed to load presets:', error);
    }
}

// Handle preset selection
document.getElementById('preset_selector').addEventListener('change', function() {
    if (!this.value) return;

    try {
        const preset = JSON.parse(this.value);
        document.getElementById('template').value = preset.template;
        document.getElementById('template_format').value = preset.format;
        document.getElementById('content_type_override').value = preset.content_type;
    } catch (error) {
        console.error('Failed to load preset:', error);
    }
});

// Handle preview template button
document.getElementById('preview-template-btn').addEventListener('click', async function() {
    const template = document.getElementById('template').value.trim();
    if (!template) {
        alert('Please enter a template first');
        return;
    }

    const templateFormat = document.getElementById('template_format').value;

    try {
        const response = await fetch('/api/webhooks/preview-template', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
                template: template,
                template_format: templateFormat
            })
        });

        const result = await response.json();

        if (result.status === 'success') {
            // Display preview in modal
            const output = document.querySelector('#preview-output code');
            if (templateFormat === 'json') {
                // Pretty print JSON
                try {
                    const parsed = JSON.parse(result.rendered);
                    output.textContent = JSON.stringify(parsed, null, 2);
                } catch (e) {
                    output.textContent = result.rendered;
                }
            } else {
                output.textContent = result.rendered;
            }

            // Show modal
            const modal = new bootstrap.Modal(document.getElementById('previewModal'));
            modal.show();
        } else {
            alert(`Preview failed: ${result.message}`);
        }
    } catch (error) {
        console.error('Error previewing template:', error);
        alert('Failed to preview template');
    }
});

// Handle clear template button
document.getElementById('clear-template-btn').addEventListener('click', function() {
    if (confirm('Clear template and reset to default format?')) {
        document.getElementById('template').value = '';
        document.getElementById('template_format').value = 'json';
        document.getElementById('content_type_override').value = '';
        document.getElementById('preset_selector').value = '';
    }
});

// Load presets on page load
loadPresets();

// Show/hide auth fields based on type
document.getElementById('auth_type').addEventListener('change', function() {
    const authType = this.value;
    const tokenField = document.getElementById('auth_token_field');
    const headersField = document.getElementById('custom_headers_field');
    const tokenHelp = document.getElementById('auth_token_help');

    tokenField.style.display = 'none';
    headersField.style.display = 'none';

    if (authType === 'bearer') {
        tokenField.style.display = 'block';
        document.getElementById('auth_token').placeholder = 'Enter bearer token';
        tokenHelp.textContent = 'Bearer token for Authorization header (encrypted when stored)';
    } else if (authType === 'basic') {
        tokenField.style.display = 'block';
        document.getElementById('auth_token').placeholder = 'username:password';
        tokenHelp.textContent = 'Format: username:password (encrypted when stored)';
    } else if (authType === 'custom') {
        headersField.style.display = 'block';
    }
});

// Load existing webhook data if editing
if (mode === 'edit' && webhookId) {
    loadWebhookData(webhookId);
}

async function loadWebhookData(id) {
    try {
        const response = await fetch(`/api/webhooks/${id}`);
        const data = await response.json();
        const webhook = data.webhook;

        // Populate form fields
        document.getElementById('name').value = webhook.name || '';
        document.getElementById('url').value = webhook.url || '';
        document.getElementById('enabled').checked = webhook.enabled;
        document.getElementById('auth_type').value = webhook.auth_type || 'none';
        document.getElementById('timeout').value = webhook.timeout || 10;
        document.getElementById('retry_count').value = webhook.retry_count || 3;

        // Trigger auth type change to show relevant fields
        document.getElementById('auth_type').dispatchEvent(new Event('change'));

        // Don't populate auth_token (it's encrypted)
        if (webhook.custom_headers) {
            document.getElementById('custom_headers').value = JSON.stringify(webhook.custom_headers, null, 2);
        }

        // Check alert types
        if (webhook.alert_types && webhook.alert_types.length > 0) {
            webhook.alert_types.forEach(type => {
                const checkbox = document.querySelector(`.alert-type-check[value="${type}"]`);
                if (checkbox) checkbox.checked = true;
            });
        }

        // Check severities
        if (webhook.severity_filter && webhook.severity_filter.length > 0) {
            webhook.severity_filter.forEach(sev => {
                const checkbox = document.querySelector(`.severity-check[value="${sev}"]`);
                if (checkbox) checkbox.checked = true;
            });
        }

        // Load template fields
        if (webhook.template) {
            document.getElementById('template').value = webhook.template;
        }
        if (webhook.template_format) {
            document.getElementById('template_format').value = webhook.template_format;
        }
        if (webhook.content_type_override) {
            document.getElementById('content_type_override').value = webhook.content_type_override;
        }
    } catch (error) {
        console.error('Error loading webhook:', error);
        alert('Failed to load webhook data');
    }
}

// Form submission
document.getElementById('webhook-form').addEventListener('submit', async function(e) {
    e.preventDefault();

    const formData = {
        name: document.getElementById('name').value,
        url: document.getElementById('url').value,
        enabled: document.getElementById('enabled').checked,
        auth_type: document.getElementById('auth_type').value,
        timeout: parseInt(document.getElementById('timeout').value),
        retry_count: parseInt(document.getElementById('retry_count').value)
    };

    // Add auth token if provided
    const authToken = document.getElementById('auth_token').value;
    if (authToken) {
        formData.auth_token = authToken;
    }

    // Add custom headers if provided
    const customHeaders = document.getElementById('custom_headers').value;
    if (customHeaders.trim()) {
        try {
            formData.custom_headers = JSON.parse(customHeaders);
        } catch (e) {
            alert('Invalid JSON in custom headers');
            return;
        }
    }

    // Collect selected alert types
    const alertTypes = Array.from(document.querySelectorAll('.alert-type-check:checked'))
        .map(cb => cb.value);
    if (alertTypes.length > 0) {
        formData.alert_types = alertTypes;
    }

    // Collect selected severities
    const severities = Array.from(document.querySelectorAll('.severity-check:checked'))
        .map(cb => cb.value);
    if (severities.length > 0) {
        formData.severity_filter = severities;
    }

    // Add template fields
    const template = document.getElementById('template').value.trim();
    if (template) {
        formData.template = template;
        formData.template_format = document.getElementById('template_format').value;

        const contentTypeOverride = document.getElementById('content_type_override').value.trim();
        if (contentTypeOverride) {
            formData.content_type_override = contentTypeOverride;
        }
    }

    try {
        const url = mode === 'edit' ? `/api/webhooks/${webhookId}` : '/api/webhooks';
        const method = mode === 'edit' ? 'PUT' : 'POST';

        const response = await fetch(url, {
            method: method,
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(formData)
        });

        const result = await response.json();

        if (result.status === 'success') {
            alert('Webhook saved successfully!');
            window.location.href = '{{ url_for("webhooks.list_webhooks") }}';
        } else {
            alert(`Failed to save webhook: ${result.message}`);
        }
    } catch (error) {
        console.error('Error saving webhook:', error);
        alert('Failed to save webhook');
    }
});

// Test webhook button
document.getElementById('test-btn').addEventListener('click', async function() {
    if (mode !== 'edit' || !webhookId) {
        alert('Please save the webhook first before testing');
        return;
    }

    if (!confirm('Send a test payload to this webhook?')) return;

    try {
        const response = await fetch(`/api/webhooks/${webhookId}/test`, { method: 'POST' });
        const result = await response.json();

        if (result.status === 'success') {
            alert(`Test successful!\nHTTP ${result.status_code}\n${result.message}`);
        } else {
            alert(`Test failed:\n${result.message}`);
        }
    } catch (error) {
        console.error('Error testing webhook:', error);
        alert('Failed to test webhook');
    }
});
</script>
{% endblock %}
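The retry behaviour described in the form's help sidebar (exponential backoff of 2^attempt seconds, capped at 60) reduces to a one-liner; a minimal sketch, assuming attempts are numbered from 1:

```python
def backoff_delay(attempt: int, cap: int = 60) -> int:
    """Seconds to wait before retry N, per the 2^attempt rule in the help text."""
    return min(2 ** attempt, cap)

# attempts 1..6 -> 2, 4, 8, 16, 32, then capped at 60 seconds
print([backoff_delay(n) for n in range(1, 7)])
```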
250
app/web/templates/webhooks/list.html
Normal file
@@ -0,0 +1,250 @@
{% extends "base.html" %}

{% block title %}Webhooks - SneakyScanner{% endblock %}

{% block content %}
<div class="row mt-4">
    <div class="col-12 d-flex justify-content-between align-items-center mb-4">
        <h1>Webhook Management</h1>
        <a href="{{ url_for('webhooks.new_webhook') }}" class="btn btn-primary">
            <i class="bi bi-plus-circle"></i> Add Webhook
        </a>
    </div>
</div>

<!-- Loading indicator -->
<div id="loading" class="text-center my-5">
    <div class="spinner-border text-primary" role="status">
        <span class="visually-hidden">Loading...</span>
    </div>
</div>

<!-- Webhooks table -->
<div id="webhooks-container" style="display: none;">
    <div class="row mb-4">
        <div class="col-12">
            <div class="card">
                <div class="card-body">
                    <div class="table-responsive">
                        <table class="table table-hover">
                            <thead>
                                <tr>
                                    <th>Name</th>
                                    <th>URL</th>
                                    <th>Alert Types</th>
                                    <th>Severity Filter</th>
                                    <th>Status</th>
                                    <th>Actions</th>
                                </tr>
                            </thead>
                            <tbody id="webhooks-tbody">
                                <!-- Populated via JavaScript -->
                            </tbody>
                        </table>
                    </div>
                </div>
            </div>
        </div>
    </div>

    <!-- Pagination -->
    <nav aria-label="Webhooks pagination" id="pagination-container">
        <ul class="pagination justify-content-center" id="pagination">
            <!-- Populated via JavaScript -->
        </ul>
    </nav>
</div>

<!-- Empty state -->
<div id="empty-state" class="text-center my-5" style="display: none;">
    <i class="bi bi-webhook" style="font-size: 4rem; color: #94a3b8;"></i>
    <p class="text-muted mt-3">No webhooks configured yet.</p>
    <a href="{{ url_for('webhooks.new_webhook') }}" class="btn btn-primary">
        <i class="bi bi-plus-circle"></i> Create Your First Webhook
    </a>
</div>

{% endblock %}

{% block scripts %}
<script>
let currentPage = 1;
const perPage = 20;

async function loadWebhooks(page = 1) {
    try {
        const response = await fetch(`/api/webhooks?page=${page}&per_page=${perPage}`);
        const data = await response.json();

        if (data.webhooks && data.webhooks.length > 0) {
            renderWebhooks(data.webhooks);
            renderPagination(data.page, data.pages, data.total);
            document.getElementById('webhooks-container').style.display = 'block';
            document.getElementById('empty-state').style.display = 'none';
        } else {
            document.getElementById('webhooks-container').style.display = 'none';
            document.getElementById('empty-state').style.display = 'block';
        }
    } catch (error) {
        console.error('Error loading webhooks:', error);
        alert('Failed to load webhooks');
    } finally {
        document.getElementById('loading').style.display = 'none';
    }
}

function renderWebhooks(webhooks) {
    const tbody = document.getElementById('webhooks-tbody');
    tbody.innerHTML = '';

    webhooks.forEach(webhook => {
        const row = document.createElement('tr');

        // Truncate URL for display
        const truncatedUrl = webhook.url.length > 50 ?
            webhook.url.substring(0, 47) + '...' : webhook.url;

        // Format alert types
        const alertTypes = webhook.alert_types && webhook.alert_types.length > 0 ?
            webhook.alert_types.map(t => `<span class="badge bg-secondary me-1">${t}</span>`).join('') :
            '<span class="text-muted">All</span>';

        // Format severity filter
        const severityFilter = webhook.severity_filter && webhook.severity_filter.length > 0 ?
            webhook.severity_filter.map(s => `<span class="badge bg-${getSeverityColor(s)} me-1">${s}</span>`).join('') :
            '<span class="text-muted">All</span>';

        // Status badge
        const statusBadge = webhook.enabled ?
            '<span class="badge bg-success">Enabled</span>' :
            '<span class="badge bg-secondary">Disabled</span>';

        row.innerHTML = `
            <td><strong>${escapeHtml(webhook.name)}</strong></td>
            <td><code class="small">${escapeHtml(truncatedUrl)}</code></td>
            <td>${alertTypes}</td>
            <td>${severityFilter}</td>
            <td>${statusBadge}</td>
            <td>
                <div class="btn-group btn-group-sm" role="group">
                    <button class="btn btn-outline-primary" onclick="testWebhook(${webhook.id})" title="Test">
                        <i class="bi bi-send"></i>
                    </button>
                    <a href="/webhooks/${webhook.id}/edit" class="btn btn-outline-primary" title="Edit">
                        <i class="bi bi-pencil"></i>
                    </a>
                    <a href="/webhooks/${webhook.id}/logs" class="btn btn-outline-info" title="Logs">
                        <i class="bi bi-list-ul"></i>
                    </a>
                    <button class="btn btn-outline-danger" onclick="deleteWebhook(${webhook.id}, '${escapeHtml(webhook.name)}')" title="Delete">
                        <i class="bi bi-trash"></i>
                    </button>
                </div>
            </td>
        `;
        tbody.appendChild(row);
    });
}

function renderPagination(currentPage, totalPages, totalItems) {
    const pagination = document.getElementById('pagination');
    pagination.innerHTML = '';

    if (totalPages <= 1) {
        document.getElementById('pagination-container').style.display = 'none';
        return;
    }

    document.getElementById('pagination-container').style.display = 'block';

    // Previous button
    const prevLi = document.createElement('li');
    prevLi.className = `page-item ${currentPage === 1 ? 'disabled' : ''}`;
    prevLi.innerHTML = `<a class="page-link" href="#" onclick="changePage(${currentPage - 1}); return false;">Previous</a>`;
    pagination.appendChild(prevLi);

    // Page numbers
    for (let i = 1; i <= totalPages; i++) {
        if (i === 1 || i === totalPages || (i >= currentPage - 2 && i <= currentPage + 2)) {
            const li = document.createElement('li');
            li.className = `page-item ${i === currentPage ? 'active' : ''}`;
            li.innerHTML = `<a class="page-link" href="#" onclick="changePage(${i}); return false;">${i}</a>`;
            pagination.appendChild(li);
        } else if (i === currentPage - 3 || i === currentPage + 3) {
            const li = document.createElement('li');
            li.className = 'page-item disabled';
            li.innerHTML = '<a class="page-link" href="#">...</a>';
            pagination.appendChild(li);
        }
    }

    // Next button
    const nextLi = document.createElement('li');
    nextLi.className = `page-item ${currentPage === totalPages ? 'disabled' : ''}`;
    nextLi.innerHTML = `<a class="page-link" href="#" onclick="changePage(${currentPage + 1}); return false;">Next</a>`;
    pagination.appendChild(nextLi);
}

function changePage(page) {
    currentPage = page;
    loadWebhooks(page);
}

async function testWebhook(id) {
    if (!confirm('Send a test payload to this webhook?')) return;

    try {
        const response = await fetch(`/api/webhooks/${id}/test`, { method: 'POST' });
        const result = await response.json();

        if (result.status === 'success') {
            alert(`Test successful!\nHTTP ${result.status_code}\n${result.message}`);
        } else {
            alert(`Test failed:\n${result.message}`);
        }
    } catch (error) {
        console.error('Error testing webhook:', error);
        alert('Failed to test webhook');
    }
}

async function deleteWebhook(id, name) {
    if (!confirm(`Are you sure you want to delete webhook "${name}"?`)) return;

    try {
        const response = await fetch(`/api/webhooks/${id}`, { method: 'DELETE' });
        const result = await response.json();

        if (result.status === 'success') {
            alert('Webhook deleted successfully');
            loadWebhooks(currentPage);
        } else {
            alert(`Failed to delete webhook: ${result.message}`);
        }
    } catch (error) {
        console.error('Error deleting webhook:', error);
        alert('Failed to delete webhook');
    }
}

function getSeverityColor(severity) {
    const colors = {
        'critical': 'danger',
        'warning': 'warning',
        'info': 'info'
    };
    return colors[severity] || 'secondary';
}

function escapeHtml(text) {
    const div = document.createElement('div');
    div.textContent = text;
    return div.innerHTML;
}

// Load webhooks on page load
document.addEventListener('DOMContentLoaded', () => {
    loadWebhooks(1);
});
</script>
{% endblock %}
328
app/web/templates/webhooks/logs.html
Normal file
@@ -0,0 +1,328 @@
{% extends "base.html" %}

{% block title %}Webhook Logs - {{ webhook.name }} - SneakyScanner{% endblock %}

{% block content %}
<div class="row mt-4">
    <div class="col-12 mb-4">
        <h1 style="color: #60a5fa;">Webhook Delivery Logs</h1>
        <nav aria-label="breadcrumb">
            <ol class="breadcrumb">
                <li class="breadcrumb-item"><a href="{{ url_for('webhooks.list_webhooks') }}">Webhooks</a></li>
                <li class="breadcrumb-item active">{{ webhook.name }}</li>
            </ol>
        </nav>
    </div>
</div>

<!-- Webhook Info -->
<div class="row mb-4">
    <div class="col-12">
        <div class="card">
            <div class="card-body">
                <div class="row">
                    <div class="col-md-6">
                        <h5 class="card-title">{{ webhook.name }}</h5>
                        <p class="text-muted mb-1"><strong>URL:</strong> <code>{{ webhook.url }}</code></p>
                        <p class="text-muted mb-0">
                            <strong>Status:</strong>
                            {% if webhook.enabled %}
                            <span class="badge bg-success">Enabled</span>
                            {% else %}
                            <span class="badge bg-secondary">Disabled</span>
                            {% endif %}
                        </p>
                    </div>
                    <div class="col-md-6 text-md-end">
                        <a href="{{ url_for('webhooks.edit_webhook', webhook_id=webhook.id) }}" class="btn btn-primary">
                            <i class="bi bi-pencil"></i> Edit Webhook
                        </a>
                    </div>
                </div>
            </div>
        </div>
    </div>
</div>

<!-- Filters -->
<div class="row mb-4">
    <div class="col-12">
        <div class="card">
            <div class="card-body">
                <div class="row g-3">
                    <div class="col-md-4">
                        <label for="status-filter" class="form-label">Status</label>
                        <select class="form-select" id="status-filter">
                            <option value="">All</option>
                            <option value="success">Success</option>
                            <option value="failed">Failed</option>
                        </select>
                    </div>
                    <div class="col-md-4 d-flex align-items-end">
                        <button type="button" class="btn btn-primary w-100" onclick="applyFilter()">
                            <i class="bi bi-funnel"></i> Apply Filter
                        </button>
                    </div>
                    <div class="col-md-4 d-flex align-items-end">
                        <button type="button" class="btn btn-outline-secondary w-100" onclick="refreshLogs()">
                            <i class="bi bi-arrow-clockwise"></i> Refresh
                        </button>
                    </div>
                </div>
            </div>
        </div>
    </div>
</div>

<!-- Loading indicator -->
<div id="loading" class="text-center my-5">
    <div class="spinner-border text-primary" role="status">
        <span class="visually-hidden">Loading...</span>
    </div>
</div>

<!-- Logs table -->
<div id="logs-container" style="display: none;">
    <div class="row mb-4">
        <div class="col-12">
            <div class="card">
                <div class="card-body">
                    <div class="table-responsive">
                        <table class="table table-hover">
                            <thead>
                                <tr>
                                    <th>Timestamp</th>
                                    <th>Alert</th>
                                    <th>Status</th>
                                    <th>HTTP Code</th>
                                    <th>Attempt</th>
                                    <th>Details</th>
                                </tr>
                            </thead>
                            <tbody id="logs-tbody">
                                <!-- Populated via JavaScript -->
                            </tbody>
                        </table>
                    </div>
                </div>
            </div>
        </div>
    </div>

    <!-- Pagination -->
    <nav aria-label="Logs pagination" id="pagination-container">
        <ul class="pagination justify-content-center" id="pagination">
            <!-- Populated via JavaScript -->
        </ul>
    </nav>
</div>

<!-- Empty state -->
<div id="empty-state" class="text-center my-5" style="display: none;">
    <i class="bi bi-list-ul" style="font-size: 4rem; color: #94a3b8;"></i>
    <p class="text-muted mt-3">No delivery logs yet.</p>
    <p class="small text-muted">Logs will appear here after alerts trigger this webhook.</p>
</div>

<!-- Modal for log details -->
<div class="modal fade" id="logDetailModal" tabindex="-1" aria-hidden="true">
    <div class="modal-dialog modal-lg">
        <div class="modal-content">
            <div class="modal-header">
                <h5 class="modal-title">Delivery Log Details</h5>
                <button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
            </div>
            <div class="modal-body">
                <div id="modal-content">
                    <!-- Populated via JavaScript -->
                </div>
            </div>
            <div class="modal-footer">
                <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
            </div>
        </div>
    </div>
</div>

{% endblock %}

{% block scripts %}
<script>
const webhookId = {{ webhook.id }};
let currentPage = 1;
let currentStatus = '';
const perPage = 20;

async function loadLogs(page = 1, status = '') {
    try {
        let url = `/api/webhooks/${webhookId}/logs?page=${page}&per_page=${perPage}`;
        if (status) {
            url += `&status=${status}`;
        }

        const response = await fetch(url);
        const data = await response.json();

        if (data.logs && data.logs.length > 0) {
            renderLogs(data.logs);
            renderPagination(data.page, data.pages, data.total);
            document.getElementById('logs-container').style.display = 'block';
            document.getElementById('empty-state').style.display = 'none';
        } else {
            document.getElementById('logs-container').style.display = 'none';
            document.getElementById('empty-state').style.display = 'block';
        }
    } catch (error) {
        console.error('Error loading logs:', error);
        alert('Failed to load delivery logs');
    } finally {
        document.getElementById('loading').style.display = 'none';
    }
}

function renderLogs(logs) {
    const tbody = document.getElementById('logs-tbody');
    tbody.innerHTML = '';

    logs.forEach(log => {
        const row = document.createElement('tr');

        // Format timestamp
        const timestamp = new Date(log.delivered_at).toLocaleString();

        // Status badge
        const statusBadge = log.status === 'success' ?
            '<span class="badge bg-success">Success</span>' :
            '<span class="badge bg-danger">Failed</span>';

        // HTTP code badge
        const httpBadge = log.response_code ?
            `<span class="badge ${log.response_code < 400 ? 'bg-success' : 'bg-danger'}">${log.response_code}</span>` :
            '<span class="text-muted">N/A</span>';

        // Alert info
        const alertInfo = log.alert_type ?
            `<span class="badge bg-secondary">${log.alert_type}</span><br><small class="text-muted">${escapeHtml(log.alert_message || '')}</small>` :
            `<small class="text-muted">Alert #${log.alert_id}</small>`;

        row.innerHTML = `
            <td><small>${timestamp}</small></td>
            <td>${alertInfo}</td>
            <td>${statusBadge}</td>
            <td>${httpBadge}</td>
            <td>${log.attempt_number || 1}</td>
            <td>
                <button class="btn btn-sm btn-outline-primary" onclick="showLogDetails(${JSON.stringify(log).replace(/"/g, '&quot;')})">
                    <i class="bi bi-eye"></i> View
                </button>
            </td>
        `;
        tbody.appendChild(row);
    });
}

function renderPagination(currentPage, totalPages, totalItems) {
    const pagination = document.getElementById('pagination');
    pagination.innerHTML = '';

    if (totalPages <= 1) {
        document.getElementById('pagination-container').style.display = 'none';
        return;
    }

    document.getElementById('pagination-container').style.display = 'block';

    // Previous button
    const prevLi = document.createElement('li');
    prevLi.className = `page-item ${currentPage === 1 ? 'disabled' : ''}`;
    prevLi.innerHTML = `<a class="page-link" href="#" onclick="changePage(${currentPage - 1}); return false;">Previous</a>`;
    pagination.appendChild(prevLi);

    // Page numbers
    for (let i = 1; i <= totalPages; i++) {
        if (i === 1 || i === totalPages || (i >= currentPage - 2 && i <= currentPage + 2)) {
            const li = document.createElement('li');
            li.className = `page-item ${i === currentPage ? 'active' : ''}`;
            li.innerHTML = `<a class="page-link" href="#" onclick="changePage(${i}); return false;">${i}</a>`;
            pagination.appendChild(li);
        } else if (i === currentPage - 3 || i === currentPage + 3) {
            const li = document.createElement('li');
            li.className = 'page-item disabled';
            li.innerHTML = '<a class="page-link" href="#">...</a>';
            pagination.appendChild(li);
        }
    }

    // Next button
    const nextLi = document.createElement('li');
    nextLi.className = `page-item ${currentPage === totalPages ? 'disabled' : ''}`;
    nextLi.innerHTML = `<a class="page-link" href="#" onclick="changePage(${currentPage + 1}); return false;">Next</a>`;
    pagination.appendChild(nextLi);
}

function changePage(page) {
    currentPage = page;
    loadLogs(page, currentStatus);
}

function applyFilter() {
    currentStatus = document.getElementById('status-filter').value;
    currentPage = 1;
    loadLogs(1, currentStatus);
}

function refreshLogs() {
    loadLogs(currentPage, currentStatus);
}

function showLogDetails(log) {
    const modalContent = document.getElementById('modal-content');

    let detailsHTML = `
        <div class="mb-3">
            <strong>Log ID:</strong> ${log.id}<br>
            <strong>Alert ID:</strong> ${log.alert_id}<br>
            <strong>Status:</strong> <span class="badge ${log.status === 'success' ? 'bg-success' : 'bg-danger'}">${log.status}</span><br>
            <strong>HTTP Code:</strong> ${log.response_code || 'N/A'}<br>
            <strong>Attempt:</strong> ${log.attempt_number || 1}<br>
            <strong>Delivered At:</strong> ${new Date(log.delivered_at).toLocaleString()}
        </div>
    `;

    if (log.response_body) {
        detailsHTML += `
            <div class="mb-3">
                <strong>Response Body:</strong>
                <pre class="bg-dark text-light p-2 rounded mt-2"><code>${escapeHtml(log.response_body)}</code></pre>
            </div>
        `;
    }

    if (log.error_message) {
        detailsHTML += `
            <div class="mb-3">
                <strong>Error Message:</strong>
                <div class="alert alert-danger mt-2">${escapeHtml(log.error_message)}</div>
            </div>
        `;
    }

    modalContent.innerHTML = detailsHTML;

    const modal = new bootstrap.Modal(document.getElementById('logDetailModal'));
    modal.show();
}

function escapeHtml(text) {
    if (!text) return '';
    const div = document.createElement('div');
    div.textContent = text;
    return div.innerHTML;
}

// Load logs on page load
document.addEventListener('DOMContentLoaded', () => {
    loadLogs(1);
});
</script>
{% endblock %}
@@ -4,7 +4,7 @@ Pagination utilities for SneakyScanner web application.
Provides helper functions for paginating SQLAlchemy queries.
"""

from typing import Any, Dict, List
from typing import Any, Callable, Dict, List, Optional
from sqlalchemy.orm import Query


@@ -53,6 +53,46 @@ class PaginatedResult:
        """Get next page number."""
        return self.page + 1 if self.has_next else None

    @property
    def prev_num(self) -> Optional[int]:
        """Alias for prev_page (Flask-SQLAlchemy compatibility)."""
        return self.prev_page

    @property
    def next_num(self) -> Optional[int]:
        """Alias for next_page (Flask-SQLAlchemy compatibility)."""
        return self.next_page

    def iter_pages(self, left_edge=1, left_current=1, right_current=2, right_edge=1):
        """
        Generate page numbers for pagination display.

        Yields page numbers and None for gaps, compatible with Flask-SQLAlchemy's
        pagination.iter_pages() method.

        Args:
            left_edge: Number of pages to show on the left edge
            left_current: Number of pages to show left of current page
            right_current: Number of pages to show right of current page
            right_edge: Number of pages to show on the right edge

        Yields:
            int or None: Page number or None for gaps

        Example:
            For 100 pages, current page 50:
            Yields: 1, None, 48, 49, 50, 51, 52, None, 100
        """
        last = 0
        for num in range(1, self.pages + 1):
            if num <= left_edge or \
               (num > self.page - left_current - 1 and num < self.page + right_current) or \
               num > self.pages - right_edge:
                if last + 1 != num:
                    yield None
                yield num
                last = num

    def to_dict(self) -> Dict[str, Any]:
        """
        Convert to dictionary for API responses.
@@ -74,6 +114,7 @@ class PaginatedResult:


def paginate(query: Query, page: int = 1, per_page: int = 20,
             transform: Optional[Callable[[Any], Dict[str, Any]]] = None,
             max_per_page: int = 100) -> PaginatedResult:
    """
    Paginate a SQLAlchemy query.
@@ -82,6 +123,7 @@ def paginate(query: Query, page: int = 1, per_page: int = 20,
        query: SQLAlchemy query to paginate
        page: Page number (1-indexed, default: 1)
        per_page: Items per page (default: 20)
        transform: Optional function to transform each item (default: None)
        max_per_page: Maximum items per page (default: 100)

    Returns:
@@ -93,6 +135,11 @@ def paginate(query: Query, page: int = 1, per_page: int = 20,
        >>> result = paginate(query, page=1, per_page=20)
        >>> scans = result.items
        >>> total_pages = result.pages

        >>> # With transform function
        >>> def scan_to_dict(scan):
        ...     return {'id': scan.id, 'name': scan.name}
        >>> result = paginate(query, page=1, per_page=20, transform=scan_to_dict)
    """
    # Validate and sanitize parameters
    page = max(1, page)  # Page must be at least 1
@@ -107,6 +154,10 @@ def paginate(query: Query, page: int = 1, per_page: int = 20,
    # Execute query with limit and offset
    items = query.limit(per_page).offset(offset).all()

    # Apply transform if provided
    if transform is not None:
        items = [transform(item) for item in items]

    return PaginatedResult(
        items=items,
        total=total,
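A short usage sketch for the extended `paginate()` and the new `iter_pages()`. The `session`, `Scan` model, and field names are placeholders; the pagination API itself is taken from the diff above.

```python
# Placeholders: a SQLAlchemy session and a Scan model are assumed to exist.
query = session.query(Scan).order_by(Scan.id.desc())
result = paginate(query, page=5, per_page=20,
                  transform=lambda s: {"id": s.id, "title": s.title})

# iter_pages() yields page numbers with None marking gaps,
# e.g. for 12 pages at page 5: 1, None, 4, 5, 6, None, 12
print(["..." if n is None else n for n in result.iter_pages()])
```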
@@ -7,8 +7,8 @@ for sensitive values like passwords and API tokens.
|
||||
|
||||
import json
|
||||
import os
|
||||
from datetime import datetime
|
||||
from typing import Any, Dict, List, Optional
|
||||
from datetime import datetime, timezone
|
||||
from typing import Any, Dict, Optional
|
||||
|
||||
import bcrypt
|
||||
from cryptography.fernet import Fernet
|
||||
@@ -32,6 +32,11 @@ class SettingsManager:
|
||||
'encryption_key',
|
||||
}
|
||||
|
||||
# Keys that are read-only (managed by developer, not user-editable)
|
||||
READ_ONLY_KEYS = {
|
||||
'encryption_key',
|
||||
}
|
||||
|
||||
def __init__(self, db_session: Session, encryption_key: Optional[bytes] = None):
|
||||
"""
|
||||
Initialize the settings manager.
|
||||
@@ -69,11 +74,11 @@ class SettingsManager:
|
||||
return new_key
|
||||
|
||||
def _store_raw(self, key: str, value: str) -> None:
|
||||
"""Store a setting without encryption (internal use only)."""
|
||||
"""Store a setting without encryption (internal use only). Bypasses read-only check."""
|
||||
setting = self.db.query(Setting).filter_by(key=key).first()
|
||||
if setting:
|
||||
setting.value = value
|
||||
setting.updated_at = datetime.utcnow()
|
||||
setting.updated_at = datetime.now(timezone.utc)
|
||||
else:
|
||||
setting = Setting(key=key, value=value)
|
||||
self.db.add(setting)
|
||||
@@ -128,7 +133,11 @@ class SettingsManager:
|
||||
|
||||
return value
|
||||
|
||||
def set(self, key: str, value: Any, encrypt: bool = None) -> None:
|
||||
def _is_read_only(self, key: str) -> bool:
|
||||
"""Check if a setting key is read-only."""
|
||||
return key in self.READ_ONLY_KEYS
|
||||
|
||||
def set(self, key: str, value: Any, encrypt: bool = None, allow_read_only: bool = False) -> None:
|
||||
"""
|
||||
Set a setting value.
|
||||
|
||||
@@ -136,7 +145,15 @@ class SettingsManager:
|
||||
key: Setting key
|
||||
value: Setting value (will be JSON-encoded if dict/list)
|
||||
encrypt: Force encryption on/off (None = auto-detect from ENCRYPTED_KEYS)
|
||||
allow_read_only: If True, allows setting read-only keys (internal use only)
|
||||
|
||||
Raises:
|
||||
ValueError: If attempting to set a read-only key without allow_read_only=True
|
||||
"""
|
||||
# Prevent modification of read-only keys unless explicitly allowed
|
||||
if not allow_read_only and self._is_read_only(key):
|
||||
raise ValueError(f"Setting '{key}' is read-only and cannot be modified via API")
|
||||
|
||||
# Convert complex types to JSON
|
||||
if isinstance(value, (dict, list)):
|
||||
value_str = json.dumps(value)
|
||||
@@ -153,7 +170,7 @@ class SettingsManager:
|
||||
setting = self.db.query(Setting).filter_by(key=key).first()
|
||||
if setting:
|
||||
setting.value = value_str
|
||||
setting.updated_at = datetime.utcnow()
|
||||
setting.updated_at = datetime.now(timezone.utc)
|
||||
else:
|
||||
setting = Setting(key=key, value=value_str)
|
||||
self.db.add(setting)
|
||||
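A short sketch of how the read-only guard above behaves at runtime, assuming `settings` is a `SettingsManager` bound to a database session:

```python
# 'encryption_key' is listed in READ_ONLY_KEYS, so a plain set() raises
try:
    settings.set('encryption_key', 'new-key')
except ValueError as exc:
    print(exc)  # Setting 'encryption_key' is read-only and cannot be modified via API

# Internal callers (e.g., default initialization) may bypass the guard explicitly
settings.set('retention_days', 30, allow_read_only=True)
```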
@@ -251,7 +268,8 @@ class SettingsManager:
|
||||
for key, value in defaults.items():
|
||||
# Only set if doesn't exist
|
||||
if self.db.query(Setting).filter_by(key=key).first() is None:
|
||||
self.set(key, value)
|
||||
# Use allow_read_only=True for initializing defaults
|
||||
self.set(key, value, allow_read_only=True)
|
||||
|
||||
|
||||
class PasswordManager:
|
||||
|
||||
@@ -1,89 +1,11 @@
|
||||
"""
|
||||
Input validation utilities for SneakyScanner web application.
|
||||
|
||||
Provides validation functions for API inputs, file paths, and data integrity.
|
||||
Provides validation functions for API inputs and data integrity.
|
||||
"""
|
||||
|
||||
import os
|
||||
from pathlib import Path
|
||||
from typing import Optional
|
||||
|
||||
import yaml
|
||||
|
||||
|
||||
def validate_config_file(file_path: str) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Validate that a configuration file exists and is valid YAML.
|
||||
|
||||
Args:
|
||||
file_path: Path to configuration file (absolute or relative filename)
|
||||
|
||||
Returns:
|
||||
Tuple of (is_valid, error_message)
|
||||
If valid, returns (True, None)
|
||||
If invalid, returns (False, error_message)
|
||||
|
||||
Examples:
|
||||
>>> validate_config_file('/app/configs/example.yaml')
|
||||
(True, None)
|
||||
>>> validate_config_file('example.yaml')
|
||||
(True, None)
|
||||
>>> validate_config_file('/nonexistent.yaml')
|
||||
(False, 'File does not exist: /nonexistent.yaml')
|
||||
"""
|
||||
# Check if path is provided
|
||||
if not file_path:
|
||||
return False, 'Config file path is required'
|
||||
|
||||
# If file_path is just a filename (not absolute), prepend configs directory
|
||||
if not file_path.startswith('/'):
|
||||
file_path = f'/app/configs/{file_path}'
|
||||
|
||||
# Convert to Path object
|
||||
path = Path(file_path)
|
||||
|
||||
# Check if file exists
|
||||
if not path.exists():
|
||||
return False, f'File does not exist: {file_path}'
|
||||
|
||||
# Check if it's a file (not directory)
|
||||
if not path.is_file():
|
||||
return False, f'Path is not a file: {file_path}'
|
||||
|
||||
# Check file extension
|
||||
if path.suffix.lower() not in ['.yaml', '.yml']:
|
||||
return False, f'File must be YAML (.yaml or .yml): {file_path}'
|
||||
|
||||
# Try to parse as YAML
|
||||
try:
|
||||
with open(path, 'r') as f:
|
||||
config = yaml.safe_load(f)
|
||||
|
||||
# Check if it's a dictionary (basic structure validation)
|
||||
if not isinstance(config, dict):
|
||||
return False, 'Config file must contain a YAML dictionary'
|
||||
|
||||
# Check for required top-level keys
|
||||
if 'title' not in config:
|
||||
return False, 'Config file missing required "title" field'
|
||||
|
||||
if 'sites' not in config:
|
||||
return False, 'Config file missing required "sites" field'
|
||||
|
||||
# Validate sites structure
|
||||
if not isinstance(config['sites'], list):
|
||||
return False, '"sites" must be a list'
|
||||
|
||||
if len(config['sites']) == 0:
|
||||
return False, '"sites" list cannot be empty'
|
||||
|
||||
except yaml.YAMLError as e:
|
||||
return False, f'Invalid YAML syntax: {str(e)}'
|
||||
except Exception as e:
|
||||
return False, f'Error reading config file: {str(e)}'
|
||||
|
||||
return True, None
|
||||
|
||||
|
||||
def validate_scan_status(status: str) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
@@ -101,190 +23,9 @@ def validate_scan_status(status: str) -> tuple[bool, Optional[str]]:
|
||||
>>> validate_scan_status('invalid')
|
||||
(False, 'Invalid status: invalid. Must be one of: running, finalizing, completed, failed, cancelled')
|
||||
"""
|
||||
valid_statuses = ['running', 'completed', 'failed']
|
||||
valid_statuses = ['running', 'finalizing', 'completed', 'failed', 'cancelled']
|
||||
|
||||
if status not in valid_statuses:
|
||||
return False, f'Invalid status: {status}. Must be one of: {", ".join(valid_statuses)}'
|
||||
|
||||
return True, None
|
||||
|
||||
|
||||
def validate_triggered_by(triggered_by: str) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Validate triggered_by value.
|
||||
|
||||
Args:
|
||||
triggered_by: Source that triggered the scan
|
||||
|
||||
Returns:
|
||||
Tuple of (is_valid, error_message)
|
||||
|
||||
Examples:
|
||||
>>> validate_triggered_by('manual')
|
||||
(True, None)
|
||||
>>> validate_triggered_by('api')
|
||||
(True, None)
|
||||
"""
|
||||
valid_sources = ['manual', 'scheduled', 'api']
|
||||
|
||||
if triggered_by not in valid_sources:
|
||||
return False, f'Invalid triggered_by: {triggered_by}. Must be one of: {", ".join(valid_sources)}'
|
||||
|
||||
return True, None
|
||||
|
||||
|
||||
def validate_scan_id(scan_id: any) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Validate scan ID is a positive integer.
|
||||
|
||||
Args:
|
||||
scan_id: Scan ID to validate
|
||||
|
||||
Returns:
|
||||
Tuple of (is_valid, error_message)
|
||||
|
||||
Examples:
|
||||
>>> validate_scan_id(42)
|
||||
(True, None)
|
||||
>>> validate_scan_id('42')
|
||||
(True, None)
|
||||
>>> validate_scan_id(-1)
|
||||
(False, 'Scan ID must be a positive integer')
|
||||
"""
|
||||
try:
|
||||
scan_id_int = int(scan_id)
|
||||
if scan_id_int <= 0:
|
||||
return False, 'Scan ID must be a positive integer'
|
||||
except (ValueError, TypeError):
|
||||
return False, f'Invalid scan ID: {scan_id}'
|
||||
|
||||
return True, None
|
||||
|
||||
|
||||
def validate_file_path(file_path: str, must_exist: bool = True) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Validate a file path.
|
||||
|
||||
Args:
|
||||
file_path: Path to validate
|
||||
must_exist: If True, file must exist. If False, only validate format.
|
||||
|
||||
Returns:
|
||||
Tuple of (is_valid, error_message)
|
||||
|
||||
Examples:
|
||||
>>> validate_file_path('/app/output/scan.json', must_exist=False)
|
||||
(True, None)
|
||||
>>> validate_file_path('', must_exist=False)
|
||||
(False, 'File path is required')
|
||||
"""
|
||||
if not file_path:
|
||||
return False, 'File path is required'
|
||||
|
||||
# Check for path traversal attempts
|
||||
if '..' in file_path:
|
||||
return False, 'Path traversal not allowed'
|
||||
|
||||
if must_exist:
|
||||
path = Path(file_path)
|
||||
if not path.exists():
|
||||
return False, f'File does not exist: {file_path}'
|
||||
if not path.is_file():
|
||||
return False, f'Path is not a file: {file_path}'
|
||||
|
||||
return True, None
|
||||
|
||||
|
||||
def sanitize_filename(filename: str) -> str:
|
||||
"""
|
||||
Sanitize a filename by removing/replacing unsafe characters.
|
||||
|
||||
Args:
|
||||
filename: Original filename
|
||||
|
||||
Returns:
|
||||
Sanitized filename safe for filesystem
|
||||
|
||||
Examples:
|
||||
>>> sanitize_filename('my scan.json')
|
||||
'my_scan.json'
|
||||
>>> sanitize_filename('../../etc/passwd')
|
||||
'etc_passwd'
|
||||
"""
|
||||
# Remove path components
|
||||
filename = os.path.basename(filename)
|
||||
|
||||
# Replace unsafe characters with underscore
|
||||
unsafe_chars = ['/', '\\', '..', ' ', ':', '*', '?', '"', '<', '>', '|']
|
||||
for char in unsafe_chars:
|
||||
filename = filename.replace(char, '_')
|
||||
|
||||
# Remove leading/trailing underscores and dots
|
||||
filename = filename.strip('_.')
|
||||
|
||||
# Ensure filename is not empty
|
||||
if not filename:
|
||||
filename = 'unnamed'
|
||||
|
||||
return filename
|
||||
|
||||
|
||||
def validate_port(port: any) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Validate port number.
|
||||
|
||||
Args:
|
||||
port: Port number to validate
|
||||
|
||||
Returns:
|
||||
Tuple of (is_valid, error_message)
|
||||
|
||||
Examples:
|
||||
>>> validate_port(443)
|
||||
(True, None)
|
||||
>>> validate_port(70000)
|
||||
(False, 'Port must be between 1 and 65535')
|
||||
"""
|
||||
try:
|
||||
port_int = int(port)
|
||||
if port_int < 1 or port_int > 65535:
|
||||
return False, 'Port must be between 1 and 65535'
|
||||
except (ValueError, TypeError):
|
||||
return False, f'Invalid port: {port}'
|
||||
|
||||
return True, None
|
||||
|
||||
|
||||
def validate_ip_address(ip: str) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Validate IPv4 address format (basic validation).
|
||||
|
||||
Args:
|
||||
ip: IP address string
|
||||
|
||||
Returns:
|
||||
Tuple of (is_valid, error_message)
|
||||
|
||||
Examples:
|
||||
>>> validate_ip_address('192.168.1.1')
|
||||
(True, None)
|
||||
>>> validate_ip_address('256.1.1.1')
|
||||
(False, 'Invalid IP address format')
|
||||
"""
|
||||
if not ip:
|
||||
return False, 'IP address is required'
|
||||
|
||||
# Basic IPv4 validation
|
||||
parts = ip.split('.')
|
||||
if len(parts) != 4:
|
||||
return False, 'Invalid IP address format'
|
||||
|
||||
try:
|
||||
for part in parts:
|
||||
num = int(part)
|
||||
if num < 0 or num > 255:
|
||||
return False, 'Invalid IP address format'
|
||||
except (ValueError, TypeError):
|
||||
return False, 'Invalid IP address format'
|
||||
|
||||
return True, None
|
||||
|
||||
73
destroy_everything.sh
Executable file
@@ -0,0 +1,73 @@
|
||||
#!/bin/bash
|
||||
|
||||
# SneakyScan Fresh Start Script
|
||||
# This script removes all data, configs, and scan output for a clean slate
|
||||
|
||||
set -e
|
||||
|
||||
# Check for root/sudo access
|
||||
if [ "$EUID" -ne 0 ]; then
|
||||
echo "============================================"
|
||||
echo " ERROR: Root access required"
|
||||
echo "============================================"
|
||||
echo ""
|
||||
echo "This script needs to run with sudo because"
|
||||
echo "Docker creates files with root ownership."
|
||||
echo ""
|
||||
echo "Please run:"
|
||||
echo " sudo ./destroy_everything.sh"
|
||||
echo ""
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "============================================"
|
||||
echo " SneakyScan Fresh Start - DESTROY EVERYTHING"
|
||||
echo "============================================"
|
||||
echo ""
|
||||
echo "This will remove:"
|
||||
echo " - All database files in ./data/"
|
||||
echo " - All config files in ./configs/"
|
||||
echo " - All scan outputs in ./output/"
|
||||
echo ""
|
||||
read -p "Are you sure you want to continue? (yes/no): " -r
|
||||
echo ""
|
||||
|
||||
if [[ ! $REPLY =~ ^[Yy][Ee][Ss]$ ]]; then
|
||||
echo "Aborted."
|
||||
exit 0
|
||||
fi
|
||||
|
||||
echo "Starting cleanup..."
|
||||
echo ""
|
||||
|
||||
# Clean data directory (database files)
|
||||
if [ -d "data" ]; then
|
||||
echo "Cleaning data directory..."
|
||||
rm -rfv data/*
|
||||
echo " Data directory cleaned"
|
||||
else
|
||||
echo " <20> Data directory not found"
|
||||
fi
|
||||
|
||||
# Clean configs directory
|
||||
if [ -d "configs" ]; then
|
||||
echo "Cleaning configs directory..."
|
||||
rm -rfv configs/*
|
||||
echo " Configs directory cleaned"
|
||||
else
|
||||
echo " <20> Configs directory not found"
|
||||
fi
|
||||
|
||||
# Clean output directory (scan results)
|
||||
if [ -d "output" ]; then
|
||||
echo "Cleaning output directory..."
|
||||
rm -rfv output/*
|
||||
echo " Output directory cleaned"
|
||||
else
|
||||
echo " <20> Output directory not found"
|
||||
fi
|
||||
|
||||
echo ""
|
||||
echo "============================================"
|
||||
echo " Fresh start complete! All data removed."
|
||||
echo "============================================"
|
||||
@@ -1,13 +0,0 @@
|
||||
version: '3.8'
|
||||
|
||||
services:
|
||||
scanner:
|
||||
build: .
|
||||
image: sneakyscanner:latest
|
||||
container_name: sneakyscanner
|
||||
privileged: true # Required for masscan raw socket access
|
||||
network_mode: host # Required for network scanning
|
||||
volumes:
|
||||
- ./configs:/app/configs:ro
|
||||
- ./output:/app/output
|
||||
command: /app/configs/example-site.yaml
|
||||
@@ -5,9 +5,9 @@ services:
|
||||
build: .
|
||||
image: sneakyscanner:latest
|
||||
container_name: sneakyscanner-web
|
||||
# Override entrypoint to run Flask app instead of scanner
|
||||
entrypoint: ["python3", "-u"]
|
||||
command: ["-m", "web.app"]
|
||||
# Use entrypoint script that auto-initializes database on first run
|
||||
entrypoint: ["/docker-entrypoint.sh"]
|
||||
command: ["python3", "-u", "-m", "web.app"]
|
||||
# Note: Using host network mode for scanner capabilities, so no port mapping needed
|
||||
# The Flask app will be accessible at http://localhost:5000
|
||||
volumes:
|
||||
@@ -28,7 +28,10 @@ services:
|
||||
- FLASK_PORT=5000
|
||||
# Database configuration (SQLite in mounted volume for persistence)
|
||||
- DATABASE_URL=sqlite:////app/data/sneakyscanner.db
|
||||
# Initial password for first run (leave empty to auto-generate)
|
||||
- INITIAL_PASSWORD=${INITIAL_PASSWORD:-}
|
||||
# Security settings
|
||||
# IMPORTANT: Set these in .env file or the application will fail to start!
|
||||
- SECRET_KEY=${SECRET_KEY:-dev-secret-key-change-in-production}
|
||||
- SNEAKYSCANNER_ENCRYPTION_KEY=${SNEAKYSCANNER_ENCRYPTION_KEY:-}
|
||||
# Optional: CORS origins (comma-separated)
|
||||
@@ -38,6 +41,9 @@ services:
|
||||
# Scheduler configuration (APScheduler)
|
||||
- SCHEDULER_EXECUTORS=${SCHEDULER_EXECUTORS:-2}
|
||||
- SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES=${SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES:-3}
|
||||
# UDP scanning configuration
|
||||
- UDP_SCAN_ENABLED=${UDP_SCAN_ENABLED:-false}
|
||||
- UDP_PORTS=${UDP_PORTS:-53,67,68,69,123,161,500,514,1900}
|
||||
# Scanner functionality requires privileged mode and host network for masscan/nmap
|
||||
privileged: true
|
||||
network_mode: host
|
||||
|
||||
File diff suppressed because it is too large
@@ -163,7 +163,7 @@ Machine-readable scan data with all discovered services, ports, and SSL/TLS info
|
||||
"title": "Scan Title",
|
||||
"scan_time": "2025-01-15T10:30:00Z",
|
||||
"scan_duration": 95.3,
|
||||
"config_file": "/app/configs/example-site.yaml",
|
||||
"config_id": "/app/configs/example-site.yaml",
|
||||
"sites": [
|
||||
{
|
||||
"name": "Site Name",
|
||||
|
||||
@@ -24,10 +24,10 @@ SneakyScanner is deployed as a Docker container running a Flask web application
|
||||
|
||||
**Architecture:**
|
||||
- **Web Application**: Flask app on port 5000 with modern web UI
|
||||
- **Database**: SQLite (persisted to volume)
|
||||
- **Database**: SQLite (persisted to volume) - stores all configurations, scan results, and settings
|
||||
- **Background Jobs**: APScheduler for async scan execution
|
||||
- **Scanner**: masscan, nmap, sslyze, Playwright
|
||||
- **Config Creator**: Web-based CIDR-to-YAML configuration builder
|
||||
- **Config Management**: Database-backed configuration system managed entirely via web UI
|
||||
- **Scheduling**: Cron-based scheduled scans with dashboard management
|
||||
|
||||
---
|
||||
@@ -74,6 +74,31 @@ docker compose version
|
||||
|
||||
For users who want to get started immediately with the web application:
|
||||
|
||||
**Option 1: Automated Setup (Recommended)**
|
||||
|
||||
```bash
|
||||
# 1. Clone the repository
|
||||
git clone <repository-url>
|
||||
cd SneakyScan
|
||||
|
||||
# 2. Run the setup script
|
||||
./setup.sh
|
||||
|
||||
# 3. Access the web interface
|
||||
# Open browser to: http://localhost:5000
|
||||
# Login with password from ./admin_password.txt or ./logs/admin_password.txt
|
||||
```
|
||||
|
||||
The setup script automatically:
|
||||
- Generates secure random keys (SECRET_KEY, ENCRYPTION_KEY)
|
||||
- Prompts for password or generates a random one
|
||||
- Creates required directories
|
||||
- Builds Docker image
|
||||
- Starts the application
|
||||
- Auto-initializes database on first run
|
||||
|
||||
**Option 2: Manual Setup**
|
||||
|
||||
```bash
|
||||
# 1. Clone the repository
|
||||
git clone <repository-url>
|
||||
@@ -82,18 +107,17 @@ cd SneakyScan
|
||||
# 2. Create environment file
|
||||
cp .env.example .env
|
||||
# Edit .env and set SECRET_KEY and SNEAKYSCANNER_ENCRYPTION_KEY
|
||||
# Optionally set INITIAL_PASSWORD (leave blank for auto-generation)
|
||||
nano .env
|
||||
|
||||
# 3. Build the Docker image
|
||||
docker compose build
|
||||
# 3. Build and start (database auto-initializes on first run)
|
||||
docker compose up --build -d
|
||||
|
||||
# 4. Initialize the database and set password
|
||||
docker compose run --rm init-db --password "YourSecurePassword"
|
||||
# 4. Check logs for auto-generated password (if not set in .env)
|
||||
docker compose logs web | grep "Password"
|
||||
# Or check: ./logs/admin_password.txt
|
||||
|
||||
# 5. Start the application
|
||||
docker compose up -d
|
||||
|
||||
# 6. Access the web interface
|
||||
# 5. Access the web interface
|
||||
# Open browser to: http://localhost:5000
|
||||
```
|
||||
|
||||
@@ -119,6 +143,13 @@ docker compose -f docker-compose-standalone.yml up
|
||||
|
||||
SneakyScanner is configured via environment variables. The recommended approach is to use a `.env` file.
|
||||
|
||||
|
||||
**UDP Port Scanning**
|
||||
|
||||
- UDP port scanning is disabled by default.
- Enable it by setting `UDP_SCAN_ENABLED=true` in your `.env` file (see the example below).
- By default, only the top 20 UDP ports are scanned; for convenience, the nmap top 100 UDP ports are also included.
|
||||
|
||||
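For example, a minimal `.env` snippet enabling UDP scanning (variable names taken from the docker-compose.yml defaults above; the port list shown is the shipped default):

```bash
# Enable UDP scanning with the default top-20 port list
UDP_SCAN_ENABLED=true
UDP_PORTS=53,67,68,69,123,161,500,514,1900
```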
#### Creating Your .env File
|
||||
|
||||
```bash
|
||||
@@ -126,13 +157,17 @@ SneakyScanner is configured via environment variables. The recommended approach
|
||||
cp .env.example .env
|
||||
|
||||
# Generate secure keys
|
||||
# SECRET_KEY: Flask session secret (64-character hex string)
|
||||
python3 -c "import secrets; print('SECRET_KEY=' + secrets.token_hex(32))" >> .env
|
||||
|
||||
# SNEAKYSCANNER_ENCRYPTION_KEY: Fernet key for database encryption (32 url-safe base64 bytes)
|
||||
python3 -c "from cryptography.fernet import Fernet; print('SNEAKYSCANNER_ENCRYPTION_KEY=' + Fernet.generate_key().decode())" >> .env
|
||||
|
||||
# Edit other settings as needed
|
||||
nano .env
|
||||
```
|
||||
|
||||
|
||||
#### Key Configuration Options
|
||||
|
||||
| Variable | Description | Default | Required |
|
||||
@@ -142,6 +177,7 @@ nano .env
|
||||
| `SECRET_KEY` | Flask session secret (change in production!) | `dev-secret-key-change-in-production` | **Yes** |
|
||||
| `SNEAKYSCANNER_ENCRYPTION_KEY` | Encryption key for sensitive settings | (empty) | **Yes** |
|
||||
| `DATABASE_URL` | SQLite database path | `sqlite:////app/data/sneakyscanner.db` | Yes |
|
||||
| `INITIAL_PASSWORD` | Password for first-run initialization (leave empty to auto-generate) | (empty) | No |
|
||||
| `LOG_LEVEL` | Logging level (DEBUG, INFO, WARNING, ERROR) | `INFO` | No |
|
||||
| `SCHEDULER_EXECUTORS` | Number of concurrent scan threads | `2` | No |
|
||||
| `SCHEDULER_JOB_DEFAULTS_MAX_INSTANCES` | Max instances of same job | `3` | No |
|
||||
@@ -162,54 +198,30 @@ The application needs these directories (created automatically by Docker):
|
||||
|
||||
```bash
|
||||
# Verify directories exist
|
||||
ls -la configs/ data/ output/ logs/
|
||||
ls -la data/ output/ logs/
|
||||
|
||||
# If missing, create them
|
||||
mkdir -p configs data output logs
|
||||
mkdir -p data output logs
|
||||
```
|
||||
|
||||
### Step 2: Configure Scan Targets
|
||||
|
||||
You can create scan configurations in two ways:
|
||||
After starting the application, create scan configurations using the web UI:
|
||||
|
||||
**Option A: Using the Web UI (Recommended - Phase 4 Feature)**
|
||||
**Creating Configurations via Web UI**
|
||||
|
||||
1. Navigate to **Configs** in the web interface
|
||||
2. Click **"Create New Config"**
|
||||
3. Use the CIDR-based config creator for quick setup:
|
||||
3. Use the form-based config creator:
|
||||
- Enter site name
|
||||
- Enter CIDR range (e.g., `192.168.1.0/24`)
|
||||
- Select expected ports from dropdowns
|
||||
- Click **"Generate Config"**
|
||||
4. Or use the **YAML Editor** for advanced configurations
|
||||
5. Save and use immediately in scans or schedules
|
||||
- Select expected TCP/UDP ports from dropdowns
|
||||
- Optionally enable ping checks
|
||||
4. Click **"Save Configuration"**
|
||||
5. Configuration is saved to database and immediately available for scans and schedules
|
||||
|
||||
**Option B: Manual YAML Files**
|
||||
**Note**: All configurations are stored in the database, not as files. This provides better reliability, easier backup, and seamless management through the web interface.
|
||||
|
||||
Create YAML configuration files manually in the `configs/` directory:
|
||||
|
||||
```bash
|
||||
# Example configuration
|
||||
cat > configs/my-network.yaml <<EOF
|
||||
title: "My Network Infrastructure"
|
||||
sites:
|
||||
- name: "Web Servers"
|
||||
cidr: "192.168.1.0/24" # Scan entire subnet
|
||||
expected_ports:
|
||||
- port: 80
|
||||
protocol: tcp
|
||||
service: "http"
|
||||
- port: 443
|
||||
protocol: tcp
|
||||
service: "https"
|
||||
- port: 22
|
||||
protocol: tcp
|
||||
service: "ssh"
|
||||
ping_expected: true
|
||||
EOF
|
||||
```
|
||||
|
||||
**Note**: Phase 4 introduced a powerful config creator in the web UI that makes it easy to generate configs from CIDR ranges without manually editing YAML.
|
||||
|
||||
### Step 3: Build Docker Image
|
||||
|
||||
@@ -223,28 +235,56 @@ docker images | grep sneakyscanner
|
||||
|
||||
### Step 4: Initialize Database
|
||||
|
||||
The database must be initialized before first use. The init-db service uses a profile, so you need to explicitly run it:
|
||||
**Automatic Initialization (Recommended)**
|
||||
|
||||
As of Phase 5, the database is automatically initialized on first run:
|
||||
|
||||
```bash
|
||||
# Just start the application
|
||||
docker compose up -d
|
||||
|
||||
# On first run, the entrypoint script will:
|
||||
# - Detect no existing database
|
||||
# - Generate a random password (if INITIAL_PASSWORD not set in .env)
|
||||
# - Save password to ./logs/admin_password.txt
|
||||
# - Initialize database schema
|
||||
# - Create default settings and alert rules
|
||||
# - Start the Flask application
|
||||
|
||||
# Check logs to see the auto-generated password
|
||||
docker compose logs web | grep "Password"
|
||||
|
||||
# Or view the password file
|
||||
cat logs/admin_password.txt
|
||||
```
|
||||
|
||||
**Manual Initialization (Advanced)**
|
||||
|
||||
You can still manually initialize the database if needed:
|
||||
|
||||
```bash
|
||||
# Initialize database and set application password
|
||||
docker compose -f docker-compose.yml run --rm init-db --password "YourSecurePassword"
|
||||
docker compose run --rm init-db --password "YourSecurePassword" --force
|
||||
|
||||
# The init-db command will:
|
||||
# - Create database schema
|
||||
# - Run all Alembic migrations
|
||||
# - Set the application password (bcrypt hashed)
|
||||
# - Create default settings with encryption
|
||||
# - Create default alert rules
|
||||
|
||||
# Verify database was created
|
||||
ls -lh data/sneakyscanner.db
|
||||
```
|
||||
|
||||
**Password Requirements:**
|
||||
**Password Management:**
|
||||
- Leave `INITIAL_PASSWORD` blank in `.env` for auto-generation
|
||||
- Auto-generated passwords are saved to `./logs/admin_password.txt`
|
||||
- For custom password, set `INITIAL_PASSWORD` in `.env`
|
||||
- Minimum 8 characters recommended
|
||||
- Use a strong, unique password
|
||||
- Store securely (password manager)
|
||||
|
||||
**Note**: The init-db service is defined with `profiles: [tools]` in docker-compose.yml, which means it won't start automatically with `docker compose up`.
|
||||
**Note**: The init-db service is defined with `profiles: [tools]` in docker-compose.yml, which means it won't start automatically with `docker compose up`. However, the web service now handles initialization automatically via the entrypoint script.
|
||||
|
||||
### Step 5: Verify Configuration
|
||||
|
||||
@@ -333,38 +373,37 @@ The dashboard provides a central view of your scanning activity:
|
||||
- **Trend Charts**: Port count trends over time using Chart.js
|
||||
- **Quick Actions**: Buttons to run scans, create configs, manage schedules
|
||||
|
||||
### Managing Scan Configurations (Phase 4)
|
||||
### Managing Scan Configurations
|
||||
|
||||
All scan configurations are stored in the database and managed entirely through the web interface.
|
||||
|
||||
**Creating Configs:**
|
||||
1. Navigate to **Configs** → **Create New Config**
|
||||
2. **CIDR Creator Mode**:
|
||||
2. Fill in the configuration form:
|
||||
- Enter site name (e.g., "Production Servers")
|
||||
- Enter CIDR range (e.g., `192.168.1.0/24`)
|
||||
- Select expected TCP/UDP ports from dropdowns
|
||||
- Click **"Generate Config"** to create YAML
|
||||
3. **YAML Editor Mode**:
|
||||
- Switch to editor tab for advanced configurations
|
||||
- Syntax highlighting with line numbers
|
||||
- Validate YAML before saving
|
||||
- Enable/disable ping checks
|
||||
3. Click **"Save Configuration"**
|
||||
4. Configuration is immediately stored in database and available for use
|
||||
|
||||
**Editing Configs:**
|
||||
1. Navigate to **Configs** → Select config
|
||||
1. Navigate to **Configs** → Select config from list
|
||||
2. Click **"Edit"** button
|
||||
3. Make changes in YAML editor
|
||||
4. Save changes (validates YAML automatically)
|
||||
3. Modify any fields in the configuration form
|
||||
4. Click **"Save Changes"** to update database
|
||||
|
||||
**Uploading Configs:**
|
||||
1. Navigate to **Configs** → **Upload**
|
||||
2. Select YAML file from your computer
|
||||
3. File is validated and saved to `configs/` directory
|
||||
|
||||
**Downloading Configs:**
|
||||
- Click **"Download"** button next to any config
|
||||
- Saves YAML file to your local machine
|
||||
**Viewing Configs:**
|
||||
- Navigate to **Configs** page to see all saved configurations
|
||||
- Each config shows site name, CIDR range, and expected ports
|
||||
- Click on any config to view full details
|
||||
|
||||
**Deleting Configs:**
|
||||
- Click **"Delete"** button
|
||||
- Click **"Delete"** button next to any config
|
||||
- **Warning**: Cannot delete configs used by active schedules
|
||||
- Deletion removes the configuration from the database permanently
|
||||
|
||||
**Note**: All configurations are database-backed, providing automatic backups when you backup the database file.
|
||||
|
||||
### Running Scans
|
||||
|
||||
@@ -421,12 +460,11 @@ SneakyScanner uses several mounted volumes for data persistence:
|
||||
|
||||
| Volume | Container Path | Purpose | Important? |
|
||||
|--------|----------------|---------|------------|
|
||||
| `./configs` | `/app/configs` | Scan configuration files (managed via web UI) | Yes |
|
||||
| `./data` | `/app/data` | SQLite database (contains all scan history) | **Critical** |
|
||||
| `./data` | `/app/data` | SQLite database (contains configurations, scan history, settings) | **Critical** |
|
||||
| `./output` | `/app/output` | Scan results (JSON, HTML, ZIP, screenshots) | Yes |
|
||||
| `./logs` | `/app/logs` | Application logs (rotating file handler) | No |
|
||||
|
||||
**Note**: As of Phase 4, the `./configs` volume is read-write to support the web-based config creator and editor. The web UI can now create, edit, and delete configuration files directly.
|
||||
**Note**: All scan configurations are stored in the SQLite database (`./data/sneakyscanner.db`). There is no separate configs directory or YAML files. Backing up the database file ensures all your configurations are preserved.
|
||||
|
||||
### Backing Up Data
|
||||
|
||||
@@ -434,23 +472,22 @@ SneakyScanner uses several mounted volumes for data persistence:
|
||||
# Create backup directory
|
||||
mkdir -p backups/$(date +%Y%m%d)
|
||||
|
||||
# Backup database
|
||||
# Backup database (includes all configurations)
|
||||
cp data/sneakyscanner.db backups/$(date +%Y%m%d)/
|
||||
|
||||
# Backup scan outputs
|
||||
tar -czf backups/$(date +%Y%m%d)/output.tar.gz output/
|
||||
|
||||
# Backup configurations
|
||||
tar -czf backups/$(date +%Y%m%d)/configs.tar.gz configs/
|
||||
```
|
||||
|
||||
**Important**: The database backup includes all scan configurations, settings, schedules, and scan history. No separate configuration file backup is needed.
|
||||
|
||||
### Restoring Data
|
||||
|
||||
```bash
|
||||
# Stop application
|
||||
docker compose -f docker-compose.yml down
|
||||
|
||||
# Restore database
|
||||
# Restore database (includes all configurations)
|
||||
cp backups/YYYYMMDD/sneakyscanner.db data/
|
||||
|
||||
# Restore outputs
|
||||
@@ -460,6 +497,8 @@ tar -xzf backups/YYYYMMDD/output.tar.gz
|
||||
docker compose -f docker-compose.yml up -d
|
||||
```
|
||||
|
||||
**Note**: Restoring the database file restores all configurations, settings, schedules, and scan history.
|
||||
|
||||
### Cleaning Up Old Scan Results
|
||||
|
||||
**Option A: Using the Web UI (Recommended)**
|
||||
@@ -508,50 +547,52 @@ curl -X POST http://localhost:5000/api/auth/logout \
|
||||
-b cookies.txt
|
||||
```
|
||||
|
||||
### Config Management (Phase 4)
|
||||
### Config Management
|
||||
|
||||
```bash
|
||||
# List all configs
|
||||
curl http://localhost:5000/api/configs \
|
||||
-b cookies.txt
|
||||
|
||||
# Get specific config
|
||||
curl http://localhost:5000/api/configs/prod-network.yaml \
|
||||
# Get specific config by ID
|
||||
curl http://localhost:5000/api/configs/1 \
|
||||
-b cookies.txt
|
||||
|
||||
# Create new config
|
||||
curl -X POST http://localhost:5000/api/configs \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"filename": "test-network.yaml",
|
||||
"content": "title: Test Network\nsites:\n - name: Test\n cidr: 10.0.0.0/24"
|
||||
"name": "Test Network",
|
||||
"cidr": "10.0.0.0/24",
|
||||
"expected_ports": [
|
||||
{"port": 80, "protocol": "tcp", "service": "http"},
|
||||
{"port": 443, "protocol": "tcp", "service": "https"}
|
||||
],
|
||||
"ping_expected": true
|
||||
}' \
|
||||
-b cookies.txt
|
||||
|
||||
# Update config
|
||||
curl -X PUT http://localhost:5000/api/configs/test-network.yaml \
|
||||
curl -X PUT http://localhost:5000/api/configs/1 \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"content": "title: Updated Test Network\nsites:\n - name: Test Site\n cidr: 10.0.0.0/24"
|
||||
"name": "Updated Test Network",
|
||||
"cidr": "10.0.1.0/24"
|
||||
}' \
|
||||
-b cookies.txt
|
||||
|
||||
# Download config
|
||||
curl http://localhost:5000/api/configs/test-network.yaml/download \
|
||||
-b cookies.txt -o test-network.yaml
|
||||
|
||||
# Delete config
|
||||
curl -X DELETE http://localhost:5000/api/configs/test-network.yaml \
|
||||
curl -X DELETE http://localhost:5000/api/configs/1 \
|
||||
-b cookies.txt
|
||||
```
|
||||
|
||||
### Scan Management
|
||||
|
||||
```bash
|
||||
# Trigger a scan
|
||||
# Trigger a scan (using config ID from database)
|
||||
curl -X POST http://localhost:5000/api/scans \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"config_file": "/app/configs/prod-network.yaml"}' \
|
||||
-d '{"config_id": 1}' \
|
||||
-b cookies.txt
|
||||
|
||||
# List all scans
|
||||
@@ -578,12 +619,12 @@ curl -X DELETE http://localhost:5000/api/scans/123 \
|
||||
curl http://localhost:5000/api/schedules \
|
||||
-b cookies.txt
|
||||
|
||||
# Create schedule
|
||||
# Create schedule (using config ID from database)
|
||||
curl -X POST http://localhost:5000/api/schedules \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"name": "Daily Production Scan",
|
||||
"config_file": "/app/configs/prod-network.yaml",
|
||||
"config_id": 1,
|
||||
"cron_expression": "0 2 * * *",
|
||||
"enabled": true
|
||||
}' \
|
||||
@@ -699,17 +740,25 @@ tail -f logs/sneakyscanner.log
|
||||
|
||||
```bash
|
||||
# Check logs for errors
|
||||
docker compose -f docker-compose.yml logs web
|
||||
docker compose logs web
|
||||
|
||||
# Common issues:
|
||||
# 1. Database not initialized - run init-db first
|
||||
# 2. Permission issues with volumes - check directory ownership
|
||||
# 3. Port 5000 already in use - change FLASK_PORT or stop conflicting service
|
||||
# 1. Permission issues with volumes - check directory ownership
|
||||
# 2. Port 5000 already in use - change FLASK_PORT or stop conflicting service
|
||||
# 3. Database initialization failed - check logs for specific error
|
||||
|
||||
# Check if database initialization is stuck
|
||||
docker compose logs web | grep -A 20 "First Run Detected"
|
||||
|
||||
# If initialization failed, clean up and retry
|
||||
docker compose down
|
||||
rm -rf data/.db_initialized # Remove marker file
|
||||
docker compose up -d
|
||||
```
|
||||
|
||||
### Database Initialization Fails
|
||||
|
||||
**Problem**: `init_db.py` fails with errors
|
||||
**Problem**: Automatic database initialization fails on first run
|
||||
|
||||
```bash
|
||||
# Check database directory permissions
|
||||
@@ -718,12 +767,37 @@ ls -la data/
|
||||
# Fix permissions if needed
|
||||
sudo chown -R $USER:$USER data/
|
||||
|
||||
# Verify SQLite is accessible
|
||||
sqlite3 data/sneakyscanner.db "SELECT 1;" 2>&1
|
||||
# View initialization logs
|
||||
docker compose logs web | grep -A 50 "Initializing database"
|
||||
|
||||
# Remove corrupted database and reinitialize
|
||||
rm data/sneakyscanner.db
|
||||
docker compose -f docker-compose.yml run --rm init-db --password "YourPassword"
|
||||
# Clean up and retry initialization
|
||||
docker compose down
|
||||
rm -rf data/sneakyscanner.db data/.db_initialized
|
||||
docker compose up -d
|
||||
|
||||
# Or manually initialize with specific password
|
||||
docker compose down
|
||||
rm -rf data/sneakyscanner.db data/.db_initialized
|
||||
docker compose run --rm init-db --password "YourPassword" --force
|
||||
docker compose up -d
|
||||
```
|
||||
|
||||
**Can't Find Password File**
|
||||
|
||||
**Problem**: Password file not created or can't be found
|
||||
|
||||
```bash
|
||||
# Check both possible locations
|
||||
cat admin_password.txt # Created by setup.sh
|
||||
cat logs/admin_password.txt # Created by Docker entrypoint
|
||||
|
||||
# Check container logs for password
|
||||
docker compose logs web | grep -i password
|
||||
|
||||
# If password file is missing, manually set one
|
||||
docker compose down
|
||||
docker compose run --rm init-db --password "YourNewPassword" --force
|
||||
docker compose up -d
|
||||
```
|
||||
|
||||
### Scans Fail with "Permission Denied"
|
||||
@@ -786,24 +860,25 @@ docker compose -f docker-compose.yml logs web | grep -E "(ERROR|Exception|Traceb
|
||||
docker compose -f docker-compose.yml exec web which masscan nmap
|
||||
```
|
||||
|
||||
### Config Files Not Appearing in Web UI
|
||||
### Configs Not Appearing in Web UI
|
||||
|
||||
**Problem**: Manually created configs don't show up in web interface
|
||||
**Problem**: Created configs don't show up in web interface
|
||||
|
||||
```bash
|
||||
# Check file permissions (must be readable by web container)
|
||||
ls -la configs/
|
||||
# Check database connectivity
|
||||
docker compose -f docker-compose.yml logs web | grep -i "database"
|
||||
|
||||
# Fix permissions if needed
|
||||
sudo chown -R 1000:1000 configs/
|
||||
chmod 644 configs/*.yaml
|
||||
# Verify database file exists and is readable
|
||||
ls -lh data/sneakyscanner.db
|
||||
|
||||
# Verify YAML syntax is valid
|
||||
docker compose -f docker-compose.yml exec web python3 -c \
|
||||
"import yaml; yaml.safe_load(open('/app/configs/your-config.yaml'))"
|
||||
|
||||
# Check web logs for parsing errors
|
||||
# Check for errors when creating configs
|
||||
docker compose -f docker-compose.yml logs web | grep -i "config"
|
||||
|
||||
# Try accessing configs via API
|
||||
curl http://localhost:5000/api/configs -b cookies.txt
|
||||
|
||||
# If database is corrupted, check integrity
|
||||
docker compose -f docker-compose.yml exec web sqlite3 /app/data/sneakyscanner.db "PRAGMA integrity_check;"
|
||||
```
|
||||
|
||||
### Health Check Failing
|
||||
@@ -890,11 +965,11 @@ server {
|
||||
# Ensure proper ownership of data directories
|
||||
sudo chown -R $USER:$USER data/ output/ logs/
|
||||
|
||||
# Restrict database file permissions
|
||||
# Restrict database file permissions (contains configurations and sensitive data)
|
||||
chmod 600 data/sneakyscanner.db
|
||||
|
||||
# Configs should be read-only
|
||||
chmod 444 configs/*.yaml
|
||||
# Ensure database directory is writable
|
||||
chmod 700 data/
|
||||
```
|
||||
|
||||
---
|
||||
@@ -962,19 +1037,17 @@ mkdir -p "$BACKUP_DIR"
|
||||
# Stop application for consistent backup
|
||||
docker compose -f docker-compose.yml stop web
|
||||
|
||||
# Backup database
|
||||
# Backup database (includes all configurations)
|
||||
cp data/sneakyscanner.db "$BACKUP_DIR/"
|
||||
|
||||
# Backup outputs (last 30 days only)
|
||||
find output/ -type f -mtime -30 -exec cp --parents {} "$BACKUP_DIR/" \;
|
||||
|
||||
# Backup configs
|
||||
cp -r configs/ "$BACKUP_DIR/"
|
||||
|
||||
# Restart application
|
||||
docker compose -f docker-compose.yml start web
|
||||
|
||||
echo "Backup complete: $BACKUP_DIR"
|
||||
echo "Database backup includes all configurations, settings, and scan history"
|
||||
```
|
||||
|
||||
Make executable and schedule with cron:
|
||||
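An illustrative crontab entry running the backup script nightly at 03:00 (the install and log paths are assumptions):

```bash
0 3 * * * /opt/sneakyscanner/backup.sh >> /var/log/sneakyscanner-backup.log 2>&1
```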
@@ -994,15 +1067,18 @@ crontab -e
|
||||
# Stop application
|
||||
docker compose -f docker-compose.yml down
|
||||
|
||||
# Restore files
|
||||
# Restore database (includes all configurations)
|
||||
cp backups/YYYYMMDD_HHMMSS/sneakyscanner.db data/
|
||||
cp -r backups/YYYYMMDD_HHMMSS/configs/* configs/
|
||||
|
||||
# Restore output files
|
||||
cp -r backups/YYYYMMDD_HHMMSS/output/* output/
|
||||
|
||||
# Start application
|
||||
docker compose -f docker-compose.yml up -d
|
||||
```
|
||||
|
||||
**Note**: Restoring the database file will restore all configurations, settings, schedules, and scan history from the backup.
|
||||
|
||||
---
|
||||
|
||||
## Support and Further Reading
|
||||
@@ -1016,13 +1092,13 @@ docker compose -f docker-compose.yml up -d
|
||||
|
||||
## What's New
|
||||
|
||||
### Phase 4 (2025-11-17) - Config Creator ✅
|
||||
- **CIDR-based Config Creator**: Web UI for generating scan configs from CIDR ranges
|
||||
- **YAML Editor**: Built-in editor with syntax highlighting (CodeMirror)
|
||||
- **Config Management UI**: List, view, edit, download, and delete configs via web interface
|
||||
- **Config Upload**: Direct YAML file upload for advanced users
|
||||
- **REST API**: 7 new config management endpoints
|
||||
### Phase 4+ (2025-11-17) - Database-Backed Configuration System ✅
|
||||
- **Database-Backed Configs**: All configurations stored in SQLite database (no YAML files)
|
||||
- **Web-Based Config Creator**: Form-based UI for creating scan configs from CIDR ranges
|
||||
- **Config Management UI**: List, view, edit, and delete configs via web interface
|
||||
- **REST API**: Full config management via RESTful API with database storage
|
||||
- **Schedule Protection**: Prevents deleting configs used by active schedules
|
||||
- **Automatic Backups**: Configurations included in database backups
|
||||
|
||||
### Phase 3 (2025-11-14) - Dashboard & Scheduling ✅
|
||||
- **Dashboard**: Summary stats, recent scans, trend charts
|
||||
@@ -1044,5 +1120,5 @@ docker compose -f docker-compose.yml up -d
|
||||
|
||||
---
|
||||
|
||||
**Last Updated**: 2025-11-17
|
||||
**Version**: Phase 4 - Config Creator Complete
|
||||
**Last Updated**: 2025-11-24
|
||||
**Version**: Phase 4+ - Database-Backed Configuration System
|
||||
|
||||
0
docs/KNOWN_ISSUES.md
Normal file
953
docs/ROADMAP.md
@@ -4,930 +4,115 @@
|
||||
|
||||
SneakyScanner is a comprehensive **Flask web application** for infrastructure monitoring and security auditing. The primary interface is the web GUI, with a CLI API client planned for scripting and automation needs.
|
||||
|
||||
**Status:** Phase 4 Complete ✅ | Phase 5 Next Up
|
||||
## Version 1.0.0 - Complete ✅
|
||||
|
||||
## Progress Overview
|
||||
- ✅ **Phase 1: Foundation** - Complete (2025-11-13)
|
||||
- Database schema, SQLAlchemy models, settings system, Flask app structure
|
||||
- ✅ **Phase 2: Flask Web App Core** - Complete (2025-11-14)
|
||||
- REST API, background jobs, authentication, web UI, testing (100 tests)
|
||||
- ✅ **Phase 3: Dashboard & Scheduling** - Complete (2025-11-14)
|
||||
- Dashboard, scan history, scheduled scans, trend charts
|
||||
- ✅ **Phase 4: Config Creator** - Complete (2025-11-17)
|
||||
- CIDR-based config creation, YAML editor, config management UI
|
||||
- 📋 **Phase 5: Email, Webhooks & Comparisons** - Next Up
|
||||
- Email notifications, alert rules, scan comparison
|
||||
- 📋 **Phase 6: CLI as API Client** - Planned
|
||||
- CLI for scripting and automation via API
|
||||
- 📋 **Phase 7: Advanced Features** - Future
|
||||
- CVE integration, timeline view, PDF export, enhanced reports
|
||||
|
||||
|
||||
|
||||
**Core Features:**
|
||||
- **Centralized dashboard** for viewing scan history and trends
|
||||
- **Scheduled scanning** for continuous infrastructure monitoring
|
||||
- **Email notifications** for critical changes and certificate expirations (coming soon)
|
||||
- **Historical analysis** with charts, graphs, and comparison reports
|
||||
- **Config creator** for easy CIDR-based scan configuration
|
||||
- **RESTful API** for integration and automation
|
||||
- **Simple deployment** using SQLite3 (single-user, self-hosted)
|
||||
|
||||
**Planned:**
|
||||
- **CLI API client** for scripting and automation workflows (Phase 6)
|
||||
|
||||
## Target Users
|
||||
|
||||
- **Infrastructure teams** monitoring on-premises networks
|
||||
- **Security professionals** performing periodic security audits
|
||||
- **DevOps engineers** tracking infrastructure drift
|
||||
- **Single users or small teams** (not enterprise multi-tenant)
|
||||
|
||||
## Technology Stack
|
||||
|
||||
### Backend
|
||||
- **Flask** 3.x - Lightweight Python web framework
|
||||
- **SQLAlchemy** 2.x - ORM for database abstraction
|
||||
- **SQLite3** - Embedded database (easy deployment, sufficient for single-user)
|
||||
- **APScheduler** 3.x - Background job scheduling for periodic scans
|
||||
- **Flask-Login** - Simple session-based authentication
|
||||
- **Flask-CORS** - API access control (optional for CLI API client)
|
||||
- **Marshmallow** - API serialization/deserialization
|
||||
|
||||
### Frontend
|
||||
- **Jinja2** - Server-side templating (already in use)
|
||||
- **Bootstrap 5** - Responsive UI framework with dark theme support
|
||||
- **Chart.js** 4.x - Lightweight charting library for trends
|
||||
- **DataTables.js** - Interactive sortable/filterable tables (Phase 6)
|
||||
- **Vanilla JavaScript** - Keep dependencies minimal
|
||||
|
||||
### Infrastructure
|
||||
- **Docker Compose** - Multi-container orchestration (Flask app + Scanner)
|
||||
- **Gunicorn** - WSGI server for production
|
||||
- **Nginx** - Reverse proxy (optional, for production deployments)
|
||||
|
||||
### Existing Components (Keep)
|
||||
- **Masscan** - Fast port discovery
|
||||
- **Nmap** - Service detection
|
||||
- **Playwright** - Screenshot capture
|
||||
- **sslyze** - SSL/TLS analysis
|
||||
|
||||
## Architecture Overview
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────────────────────────────┐
|
||||
│ Flask Web Application │
|
||||
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────────┐ │
|
||||
│ │ Web UI │ │ REST API │ │ Scheduler │ │
|
||||
│ │ (Dashboard) │ │ (JSON/CRUD) │ │ (APScheduler) │ │
|
||||
│ └──────┬───────┘ └──────┬───────┘ └────────┬─────────┘ │
|
||||
│ │ │ │ │
|
||||
│ └─────────────────┴────────────────────┘ │
|
||||
│ │ │
|
||||
│ ┌────────▼────────┐ │
|
||||
│ │ SQLAlchemy │ │
|
||||
│ │ (ORM Layer) │ │
|
||||
│ └────────┬────────┘ │
|
||||
│ │ │
|
||||
│ ┌────────▼────────┐ │
|
||||
│ │ SQLite3 DB │ │
|
||||
│ │ (scan history) │ │
|
||||
│ └─────────────────┘ │
|
||||
└───────────────────────────┬─────────────────────────────────┘
|
||||
│
|
||||
┌──────────▼──────────┐
|
||||
│ Scanner Engine │
|
||||
│ (scanner.py) │
|
||||
│ ┌────────────────┐ │
|
||||
│ │ Masscan/Nmap │ │
|
||||
│ │ Playwright │ │
|
||||
│ │ sslyze │ │
|
||||
│ └────────────────┘ │
|
||||
└─────────────────────┘
|
||||
│
|
||||
┌──────────▼──────────┐
|
||||
│ CLI API Client │
|
||||
│ (optional future) │
|
||||
└─────────────────────┘
|
||||
```
|
||||
|
||||
### Component Interaction
|
||||
|
||||
1. **Web UI** - User interacts with dashboard, triggers scans, views history
|
||||
2. **REST API** - Handles requests from web UI and CLI client
|
||||
3. **Scheduler (APScheduler)** - Triggers scans based on cron-like schedules
|
||||
4. **SQLAlchemy ORM** - Abstracts database operations
|
||||
5. **SQLite3 Database** - Stores scan results, schedules, settings, alerts
|
||||
6. **Scanner Engine** - Performs actual network scanning (masscan, nmap, etc.)
|
||||
7. **CLI API Client** - Future: thin client that calls Flask API
|
||||
|
||||
## Database Schema Design
|
||||
|
||||
### Core Tables
|
||||
|
||||
#### `scans`
|
||||
Stores metadata about each scan execution; a minimal model sketch follows the table.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique scan ID |
|
||||
| `timestamp` | DATETIME | Scan start time (UTC) |
|
||||
| `duration` | FLOAT | Total scan duration (seconds) |
|
||||
| `status` | VARCHAR(20) | `running`, `completed`, `failed` |
|
||||
| `config_file` | TEXT | Path to YAML config used |
|
||||
| `title` | TEXT | Scan title from config |
|
||||
| `json_path` | TEXT | Path to JSON report |
|
||||
| `html_path` | TEXT | Path to HTML report |
|
||||
| `zip_path` | TEXT | Path to ZIP archive |
|
||||
| `screenshot_dir` | TEXT | Path to screenshot directory |
|
||||
| `created_at` | DATETIME | Record creation time |
|
||||
| `triggered_by` | VARCHAR(50) | `manual`, `scheduled`, `api` |
|
||||
| `schedule_id` | INTEGER | FK to schedules (if triggered by schedule) |
|
||||
|
||||
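A minimal SQLAlchemy sketch of the `scans` table above (column subset; types follow the table, and the actual model code may differ):

```python
from sqlalchemy import Column, DateTime, Float, Integer, String, Text
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Scan(Base):
    __tablename__ = 'scans'

    id = Column(Integer, primary_key=True)
    timestamp = Column(DateTime)          # scan start time (UTC)
    duration = Column(Float)              # total scan duration (seconds)
    status = Column(String(20))           # running, completed, failed
    config_file = Column(Text)            # path to YAML config used
    title = Column(Text)                  # scan title from config
    triggered_by = Column(String(50))     # manual, scheduled, api
    schedule_id = Column(Integer)         # FK to schedules (if scheduled)
```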
#### `scan_sites`
|
||||
Logical grouping of IPs by site.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique site record ID |
|
||||
| `scan_id` | INTEGER | FK to scans |
|
||||
| `site_name` | VARCHAR(255) | Site name from config |
|
||||
|
||||
#### `scan_ips`
|
||||
IP addresses scanned in each scan.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique IP record ID |
|
||||
| `scan_id` | INTEGER | FK to scans |
|
||||
| `site_id` | INTEGER | FK to scan_sites |
|
||||
| `ip_address` | VARCHAR(45) | IPv4 or IPv6 address |
|
||||
| `ping_expected` | BOOLEAN | Expected ping response |
|
||||
| `ping_actual` | BOOLEAN | Actual ping response |
|
||||
|
||||
#### `scan_ports`
|
||||
Discovered TCP/UDP ports.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique port record ID |
|
||||
| `scan_id` | INTEGER | FK to scans |
|
||||
| `ip_id` | INTEGER | FK to scan_ips |
|
||||
| `port` | INTEGER | Port number (1-65535) |
|
||||
| `protocol` | VARCHAR(10) | `tcp` or `udp` |
|
||||
| `expected` | BOOLEAN | Was this port expected? |
|
||||
| `state` | VARCHAR(20) | `open`, `closed`, `filtered` |
|
||||
|
||||
#### `scan_services`
|
||||
Detected services on open ports.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique service record ID |
|
||||
| `scan_id` | INTEGER | FK to scans |
|
||||
| `port_id` | INTEGER | FK to scan_ports |
|
||||
| `service_name` | VARCHAR(100) | Service name (e.g., `ssh`, `http`) |
|
||||
| `product` | VARCHAR(255) | Product name (e.g., `OpenSSH`) |
|
||||
| `version` | VARCHAR(100) | Version string |
|
||||
| `extrainfo` | TEXT | Additional nmap info |
|
||||
| `ostype` | VARCHAR(100) | OS type if detected |
|
||||
| `http_protocol` | VARCHAR(10) | `http` or `https` (if web service) |
|
||||
| `screenshot_path` | TEXT | Relative path to screenshot |
|
||||
|
||||
#### `scan_certificates`
|
||||
SSL/TLS certificates discovered on HTTPS services.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique certificate record ID |
|
||||
| `scan_id` | INTEGER | FK to scans |
|
||||
| `service_id` | INTEGER | FK to scan_services |
|
||||
| `subject` | TEXT | Certificate subject (CN) |
|
||||
| `issuer` | TEXT | Certificate issuer |
|
||||
| `serial_number` | TEXT | Serial number |
|
||||
| `not_valid_before` | DATETIME | Validity start date |
|
||||
| `not_valid_after` | DATETIME | Validity end date |
|
||||
| `days_until_expiry` | INTEGER | Days until expiration |
|
||||
| `sans` | TEXT | JSON array of SANs |
|
||||
| `is_self_signed` | BOOLEAN | Self-signed certificate flag |
|
||||
|
||||
#### `scan_tls_versions`
|
||||
TLS version support and cipher suites.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique TLS version record ID |
|
||||
| `scan_id` | INTEGER | FK to scans |
|
||||
| `certificate_id` | INTEGER | FK to scan_certificates |
|
||||
| `tls_version` | VARCHAR(20) | `TLS 1.0`, `TLS 1.1`, `TLS 1.2`, `TLS 1.3` |
|
||||
| `supported` | BOOLEAN | Is this version supported? |
|
||||
| `cipher_suites` | TEXT | JSON array of cipher suites |
|
||||
|
||||
### Scheduling & Notifications Tables
|
||||
|
||||
#### `schedules`
|
||||
Scheduled scan configurations.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique schedule ID |
|
||||
| `name` | VARCHAR(255) | Schedule name (e.g., "Daily prod scan") |
|
||||
| `config_file` | TEXT | Path to YAML config |
|
||||
| `cron_expression` | VARCHAR(100) | Cron-like schedule (e.g., `0 2 * * *`) |
|
||||
| `enabled` | BOOLEAN | Is schedule active? |
|
||||
| `last_run` | DATETIME | Last execution time |
|
||||
| `next_run` | DATETIME | Next scheduled execution |
|
||||
| `created_at` | DATETIME | Schedule creation time |
|
||||
| `updated_at` | DATETIME | Last modification time |
|
||||
|
||||
#### `alerts`
|
||||
Alert history and notifications sent.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique alert ID |
|
||||
| `scan_id` | INTEGER | FK to scans |
|
||||
| `alert_type` | VARCHAR(50) | `new_port`, `cert_expiry`, `service_change`, `ping_failed` |
|
||||
| `severity` | VARCHAR(20) | `info`, `warning`, `critical` |
|
||||
| `message` | TEXT | Human-readable alert message |
|
||||
| `ip_address` | VARCHAR(45) | Related IP (optional) |
|
||||
| `port` | INTEGER | Related port (optional) |
|
||||
| `email_sent` | BOOLEAN | Was email notification sent? |
|
||||
| `email_sent_at` | DATETIME | Email send timestamp |
|
||||
| `created_at` | DATETIME | Alert creation time |
|
||||
|
||||
#### `alert_rules`
|
||||
User-defined alert rules.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique rule ID |
|
||||
| `rule_type` | VARCHAR(50) | `unexpected_port`, `cert_expiry`, `service_down`, etc. |
|
||||
| `enabled` | BOOLEAN | Is rule active? |
|
||||
| `threshold` | INTEGER | Threshold value (e.g., days for cert expiry) |
|
||||
| `email_enabled` | BOOLEAN | Send email for this rule? |
|
||||
| `created_at` | DATETIME | Rule creation time |
|
||||
|
||||
### Settings Table
|
||||
|
||||
#### `settings`
|
||||
Application configuration key-value store.
|
||||
|
||||
| Column | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `id` | INTEGER PRIMARY KEY | Unique setting ID |
|
||||
| `key` | VARCHAR(255) UNIQUE | Setting key (e.g., `smtp_server`) |
|
||||
| `value` | TEXT | Setting value (JSON for complex values) |
|
||||
| `updated_at` | DATETIME | Last modification time |
|
||||
|
||||
**Example Settings:**
|
||||
- `smtp_server` - SMTP server hostname
|
||||
- `smtp_port` - SMTP port (587, 465, 25)
|
||||
- `smtp_username` - SMTP username
|
||||
- `smtp_password` - SMTP password (encrypted)
|
||||
- `smtp_from_email` - From email address
|
||||
- `smtp_to_emails` - JSON array of recipient emails
|
||||
- `app_password` - Single-user password hash (bcrypt)
|
||||
- `retention_days` - How long to keep old scans (0 = forever)
|
||||
|
||||
## API Design
|
||||
|
||||
### REST API Endpoints
|
||||
|
||||
All API endpoints return JSON and follow RESTful conventions.
|
||||
|
||||
#### Scans
|
||||
|
||||
| Method | Endpoint | Description | Request Body | Response |
|
||||
|--------|----------|-------------|--------------|----------|
|
||||
| `GET` | `/api/scans` | List all scans (paginated) | - | `{ "scans": [...], "total": N, "page": 1 }` |
|
||||
| `GET` | `/api/scans/{id}` | Get scan details | - | `{ "scan": {...} }` |
|
||||
| `POST` | `/api/scans` | Trigger new scan | `{ "config_file": "path" }` | `{ "scan_id": N, "status": "running" }` |
|
||||
| `DELETE` | `/api/scans/{id}` | Delete scan and files | - | `{ "status": "deleted" }` |
|
||||
| `GET` | `/api/scans/{id}/status` | Get scan status | - | `{ "status": "running", "progress": "45%" }` |
|
||||
| `GET` | `/api/scans/{id1}/compare/{id2}` | Compare two scans | - | `{ "diff": {...} }` |
|
||||
|
||||
#### Schedules
|
||||
|
||||
| Method | Endpoint | Description | Request Body | Response |
|
||||
|--------|----------|-------------|--------------|----------|
|
||||
| `GET` | `/api/schedules` | List all schedules | - | `{ "schedules": [...] }` |
|
||||
| `GET` | `/api/schedules/{id}` | Get schedule details | - | `{ "schedule": {...} }` |
|
||||
| `POST` | `/api/schedules` | Create new schedule | `{ "name": "...", "config_file": "...", "cron_expression": "..." }` | `{ "schedule_id": N }` |
|
||||
| `PUT` | `/api/schedules/{id}` | Update schedule | `{ "enabled": true, "cron_expression": "..." }` | `{ "status": "updated" }` |
|
||||
| `DELETE` | `/api/schedules/{id}` | Delete schedule | - | `{ "status": "deleted" }` |
|
||||
| `POST` | `/api/schedules/{id}/trigger` | Manually trigger scheduled scan | - | `{ "scan_id": N }` |
|
||||
|
||||
#### Alerts
|
||||
|
||||
| Method | Endpoint | Description | Request Body | Response |
|
||||
|--------|----------|-------------|--------------|----------|
|
||||
| `GET` | `/api/alerts` | List recent alerts | - | `{ "alerts": [...] }` |
|
||||
| `GET` | `/api/alerts/rules` | List alert rules | - | `{ "rules": [...] }` |
|
||||
| `POST` | `/api/alerts/rules` | Create alert rule | `{ "rule_type": "...", "threshold": N }` | `{ "rule_id": N }` |
|
||||
| `PUT` | `/api/alerts/rules/{id}` | Update alert rule | `{ "enabled": false }` | `{ "status": "updated" }` |
|
||||
| `DELETE` | `/api/alerts/rules/{id}` | Delete alert rule | - | `{ "status": "deleted" }` |
|
||||
|
||||
#### Settings
|
||||
|
||||
| Method | Endpoint | Description | Request Body | Response |
|
||||
|--------|----------|-------------|--------------|----------|
|
||||
| `GET` | `/api/settings` | Get all settings (sanitized) | - | `{ "settings": {...} }` |
|
||||
| `PUT` | `/api/settings` | Update settings | `{ "smtp_server": "...", ... }` | `{ "status": "updated" }` |
|
||||
| `POST` | `/api/settings/test-email` | Test email configuration | - | `{ "status": "sent" }` |
|
||||
|
||||
#### Statistics & Trends
|
||||
|
||||
| Method | Endpoint | Description | Request Body | Response |
|
||||
|--------|----------|-------------|--------------|----------|
|
||||
| `GET` | `/api/stats/summary` | Dashboard summary stats | - | `{ "total_scans": N, "last_scan": "...", ... }` |
|
||||
| `GET` | `/api/stats/trends` | Trend data for charts | `?days=30&metric=port_count` | `{ "data": [...] }` |
|
||||
| `GET` | `/api/stats/certificates` | Certificate expiry overview | - | `{ "expiring_soon": [...], "expired": [...] }` |
|
||||
|
||||
### Authentication
|
||||
|
||||
**Phase 2-3:** Simple session-based authentication (single-user)
|
||||
- Login endpoint: `POST /api/auth/login` (username/password)
|
||||
- Logout endpoint: `POST /api/auth/logout`
|
||||
- Session cookies with Flask-Login
|
||||
- Password stored as bcrypt hash in settings table
|
||||
|
||||
**Phase 5:** API token authentication for CLI client
|
||||
- Generate API token: `POST /api/auth/token`
|
||||
- Revoke token: `DELETE /api/auth/token`
|
||||
- CLI sends token in `Authorization: Bearer <token>` header (see the sketch below)
|
||||
|
||||
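An illustrative token flow matching the plan above (endpoints taken from this section; the token value is a placeholder):

```bash
# Generate a token from an authenticated session, then use it for scripted calls
curl -X POST http://localhost:5000/api/auth/token -b cookies.txt
curl http://localhost:5000/api/scans -H "Authorization: Bearer <token>"
```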
## Phased Roadmap

### Phase 1: Foundation ✅
**Completed:** 2025-11-13

**Deliverables:**
- Database schema with 11 tables (SQLAlchemy ORM, Alembic migrations)
- Settings system with encryption (bcrypt, Fernet)
- Flask app structure with API blueprints
- Docker Compose deployment

---
### Phase 2: Flask Web App Core ✅
**Completed:** 2025-11-14

**Deliverables:**
- REST API (8 endpoints for scans, settings)
- Background job queue (APScheduler, 3 concurrent scans)
- Session-based authentication (Flask-Login)
- Web UI templates (dashboard, scan list/detail, login)
- Comprehensive test suite (100 tests)

---
### Phase 3: Dashboard & Scheduling ✅
**Completed:** 2025-11-14

**Deliverables:**
- Dashboard with summary stats and trend charts (Chart.js)
- Scan detail pages with full results display
- Scheduled scan management (cron expressions; see the sketch after this list)
- Download buttons for reports (JSON, HTML, ZIP)
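To make the cron-based scheduling concrete, here is a minimal sketch of how a schedule's cron expression can be wired into APScheduler. The actual wiring in the app may differ; the job function and schedule values are illustrative.

```python
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger


def run_scheduled_scan(schedule_id):
    """Placeholder job body; the real app would launch a scan here."""
    print(f"Running scheduled scan {schedule_id}")


scheduler = BackgroundScheduler()
# "0 2 * * *" = every day at 02:00, standard 5-field cron syntax
scheduler.add_job(
    run_scheduled_scan,
    CronTrigger.from_crontab("0 2 * * *"),
    args=[1],
    id="schedule-1",
    replace_existing=True,
)
scheduler.start()
```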
---

### Phase 4: Config Creator ✅
**Completed:** 2025-11-17

**Deliverables:**
- CIDR-based config creation UI
- YAML editor with CodeMirror
- Config management (list, view, edit, download, delete)
- REST API for config operations (7 endpoints)

---
### Phase 5: Webhooks & Alerting ✅
**Completed:** 2025-11-19

**Deliverables:**
- Alert Rule Engine (9 alert types: unexpected_port, cert_expiry, ping_failed, etc.)
- Webhook notifications with retry logic (see the sketch after this list)
- Multiple webhook URLs with independent filtering
- Notification templates (Slack, Discord, PagerDuty support)
- Alert deduplication
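As a rough sketch of the retry behaviour (the shipped implementation is not reproduced here; the attempt count, backoff curve, and timeout are assumed values):

```python
import time

import requests


def deliver_webhook(url, payload, attempts=3, timeout=10):
    """POST a JSON payload, retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            resp = requests.post(url, json=payload, timeout=timeout)
            if resp.status_code < 400:
                return True  # delivered
        except requests.RequestException:
            pass  # network error: fall through to retry
        time.sleep(2 ** attempt)  # 1s, 2s, 4s between attempts
    return False
```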
---

## Planned Features

### Version 1.1.0 - Communication & Automation

#### CLI as API Client
- CLI tool for scripting and automation via the REST API
- API token authentication (Bearer tokens)
- Commands for scan management, schedules, alerts

#### Email Notifications
- SMTP integration with Flask-Mail
- Jinja2 email templates (HTML + plain text)
- Configurable recipients and rate limiting (a Flask-Mail sketch follows below)
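Purely as a sketch of how the planned Flask-Mail integration could look (nothing here is implemented yet; the config values are illustrative and would come from the encrypted settings store):

```python
from flask import Flask
from flask_mail import Mail, Message

app = Flask(__name__)
# SMTP settings mirroring the planned settings keys (values illustrative)
app.config.update(
    MAIL_SERVER="smtp.example.com",
    MAIL_PORT=587,
    MAIL_USE_TLS=True,
    MAIL_USERNAME="alerts@example.com",
    MAIL_PASSWORD="...",  # would be decrypted from the settings table
    MAIL_DEFAULT_SENDER="SneakyScanner <alerts@example.com>",
)
mail = Mail(app)


def send_alert_email(subject, text_body, html_body, recipients):
    """Send one alert email with plain-text and HTML parts."""
    msg = Message(subject=subject, recipients=recipients,
                  body=text_body, html=html_body)
    with app.app_context():
        mail.send(msg)
```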
#### Site CSV Export/Import
- Bulk site management via CSV files (a minimal sketch follows below)
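A minimal sketch of what CSV-based site import/export could look like; the column names are assumptions, since the schema for this feature is not specified yet:

```python
import csv

# Illustrative CSV schema for bulk site management (columns assumed);
# all values are read back as strings and would need parsing.
FIELDS = ["site_name", "cidr", "expected_ports"]


def export_sites(path, sites):
    """Write a list of site dicts to a CSV file."""
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(sites)


def import_sites(path):
    """Read site dicts back from a CSV file."""
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))
```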
---

### Version 1.2.0 - Reporting & Analysis

#### Scan Comparison
- Compare two scans API endpoint (`GET /api/scans/{id1}/compare/{id2}`; see the sketch after this list)
- Side-by-side comparison view with color-coded differences (green = new, red = removed, yellow = changed)
- Export comparison report to PDF/HTML
- "Compare with previous scan" button on scan detail page
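As a rough illustration of the planned diff logic (not the actual implementation), the sketch below compares two scans' open-port sets; the input shape is an assumption:

```python
def _port_set(scan):
    """Collect (ip, port, protocol) tuples from an assumed scan shape:
    {"ports": [{"ip": "192.168.1.100", "port": 3306, "protocol": "tcp"}, ...]}
    """
    return {(p["ip"], p["port"], p["protocol"]) for p in scan["ports"]}


def diff_port_sets(old_scan, new_scan):
    """Report port-level drift between two scans as plain set differences."""
    old_ports, new_ports = _port_set(old_scan), _port_set(new_scan)
    return {
        "new_ports": sorted(new_ports - old_ports),
        "removed_ports": sorted(old_ports - new_ports),
    }
```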
#### Enhanced Reports
- Sortable/filterable tables (DataTables.js)
- PDF export (WeasyPrint)
---

### Version 1.3.0 - Visualization

#### Timeline View
- Visual scan history timeline
- Filter by site/IP
- Event annotations

#### Advanced Charts
- Port activity heatmap
- Certificate expiration forecast

---
### Version 2.0.0 - Security Intelligence

#### Vulnerability Detection
- CVE database integration (NVD API)
- Service version matching to known CVEs (see the sketch after this list)
- CVSS severity scores
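To make the version-matching idea concrete, here is a minimal sketch that checks detected service versions against a local table of known-vulnerable ranges. The table contents, identifiers, and matching rule are purely illustrative placeholders; a real implementation would populate them from the NVD API.

```python
from packaging.version import Version

# Illustrative vulnerability table: (product, fixed_in, cve_id, cvss).
# The CVE identifier below is a placeholder, not a real advisory.
KNOWN_VULNS = [
    ("OpenSSH", "9.3", "CVE-XXXX-YYYY", 7.5),
]


def match_vulns(product, version):
    """Return illustrative CVE matches for a detected service version."""
    hits = []
    for vuln_product, fixed_in, cve_id, cvss in KNOWN_VULNS:
        if product == vuln_product and Version(version) < Version(fixed_in):
            hits.append({"cve": cve_id, "cvss": cvss, "fixed_in": fixed_in})
    return hits
```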
---
## Current Architecture

**Primary Interface:** Web GUI (Phases 1-5 Complete)
- Full-featured Flask web application
- Dashboard, scan management, scheduling, config creator
- REST API for all operations
- Single-user deployment with SQLite

**Coming Soon:** CLI API Client (planned for Version 1.1.0)
- Thin client for scripting and automation
- Calls the Flask API for scan operations
- Results stored centrally in the database
- Access to all web features from the command line

**Core Scanning Engine:**
- Masscan for port discovery
- Nmap for service detection
- Playwright for screenshots
- sslyze for SSL/TLS analysis

**Deployment:**
- Docker Compose for easy deployment
- SQLite database (single-user, embedded)
- Gunicorn WSGI server
- Optional Nginx reverse proxy
## Development Workflow

### Iteration Cycle
1. **Plan** - Define features for the phase
2. **Implement** - Code backend + frontend
3. **Test** - Unit tests + manual testing
4. **Deploy** - Update Docker Compose
5. **Document** - Update README.md, ROADMAP.md
6. **Review** - Get user feedback
7. **Iterate** - Adjust priorities based on feedback

### Git Workflow
- **main branch** - Stable releases
- **develop branch** - Active development
- **feature branches** - Individual features (`feature/dashboard`, `feature/scheduler`)
- **Pull requests** - Review before merge

### Testing Strategy
- **Unit tests** - pytest for models, API endpoints
- **Integration tests** - Full scan → DB → API workflow
- **Manual testing** - UI/UX testing in browser
- **Performance tests** - Large scans, database queries

### Documentation
- **README.md** - User-facing documentation (updated each phase)
- **ROADMAP.md** - This file (updated as priorities shift)
- **CLAUDE.md** - Developer documentation (architecture, code references)
- **API_REFERENCE.md** - REST API documentation
## Resources & References

### Documentation
- [Flask Documentation](https://flask.palletsprojects.com/)
- [SQLAlchemy ORM](https://docs.sqlalchemy.org/)
- [APScheduler](https://apscheduler.readthedocs.io/)
- [Chart.js](https://www.chartjs.org/docs/)
- [Bootstrap 5](https://getbootstrap.com/docs/5.3/)

### Tutorials
- [Flask Mega-Tutorial](https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world)
- [SQLAlchemy Tutorial](https://docs.sqlalchemy.org/en/20/tutorial/)
- [APScheduler with Flask](https://github.com/viniciuschiele/flask-apscheduler)

### Similar Projects (Inspiration)
- [OpenVAS](https://www.openvas.org/) - Vulnerability scanner with web UI
- [Nessus](https://www.tenable.com/products/nessus) - Commercial scanner (inspiration for UI/UX)
- [OWASP ZAP](https://www.zaproxy.org/) - Web app scanner (comparison reports, alerts)
## Changelog

| Date | Version | Changes |
|------|---------|---------|
| 2025-11-13 | 1.0.0-alpha | Phase 1 complete - Foundation |
| 2025-11-14 | 1.0.0-beta | Phases 2-3 complete - Web App Core, Dashboard & Scheduling |
| 2025-11-17 | 1.0.0-rc1 | Phase 4 complete - Config Creator |
| 2025-11-19 | 1.0.0 | Phase 5 complete - Webhooks & Alerting |

---

**Last Updated:** 2025-11-20
scripts/release.sh (new executable file, 99 lines)
```bash
#!/bin/bash

# SneakyScan Release Script
# Handles version bumping, branch merging, tagging, and pushing

set -e

CONFIG_FILE="app/web/config.py"
DEVELOP_BRANCH="nightly"
STAGING_BRANCH="beta"
MAIN_BRANCH="master"

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

echo -e "${GREEN}=== SneakyScan Release Script ===${NC}\n"

# Ensure we're in the repo root
if [ ! -f "$CONFIG_FILE" ]; then
    echo -e "${RED}Error: Must run from repository root${NC}"
    exit 1
fi

# Check for uncommitted changes
if [ -n "$(git status --porcelain)" ]; then
    echo -e "${RED}Error: You have uncommitted changes. Please commit or stash them first.${NC}"
    exit 1
fi

# Prompt for version
read -p "Enter version (e.g., 1.0.0): " VERSION

# Validate version format (semver)
if ! [[ "$VERSION" =~ ^[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9]+)?$ ]]; then
    echo -e "${RED}Error: Invalid version format. Use semver (e.g., 1.0.0 or 1.0.0-beta)${NC}"
    exit 1
fi

TAG_NAME="v$VERSION"

# Check if tag already exists
if git rev-parse "$TAG_NAME" >/dev/null 2>&1; then
    echo -e "${RED}Error: Tag $TAG_NAME already exists${NC}"
    exit 1
fi

echo -e "\n${YELLOW}Release version: $VERSION${NC}"
echo -e "${YELLOW}Tag name: $TAG_NAME${NC}\n"

read -p "Proceed with release? (y/n): " CONFIRM
if [ "$CONFIRM" != "y" ]; then
    echo "Release cancelled."
    exit 0
fi

# Fetch latest from remote
echo -e "\n${GREEN}Fetching latest from remote...${NC}"
git fetch origin

# Switch to the develop branch first so the version bump lands there
git checkout "$DEVELOP_BRANCH"

# Update version in config.py
echo -e "\n${GREEN}Updating version in $CONFIG_FILE...${NC}"
sed -i "s/APP_VERSION = .*/APP_VERSION = '$VERSION'/" "$CONFIG_FILE"

# Commit version change on develop
echo -e "\n${GREEN}Committing version change on $DEVELOP_BRANCH...${NC}"
git add "$CONFIG_FILE"
git commit -m "Bump version to $VERSION"

# Merge develop into staging
echo -e "\n${GREEN}Merging $DEVELOP_BRANCH into $STAGING_BRANCH...${NC}"
git checkout "$STAGING_BRANCH"
git merge "$DEVELOP_BRANCH" -m "Merge $DEVELOP_BRANCH into $STAGING_BRANCH for release $VERSION"

# Merge staging into main
echo -e "\n${GREEN}Merging $STAGING_BRANCH into $MAIN_BRANCH...${NC}"
git checkout "$MAIN_BRANCH"
git merge "$STAGING_BRANCH" -m "Merge $STAGING_BRANCH into $MAIN_BRANCH for release $VERSION"

# Create tag
echo -e "\n${GREEN}Creating tag $TAG_NAME...${NC}"
git tag -a "$TAG_NAME" -m "Release $VERSION"

# Push everything
echo -e "\n${GREEN}Pushing branches and tag to remote...${NC}"
git push origin "$DEVELOP_BRANCH"
git push origin "$STAGING_BRANCH"
git push origin "$MAIN_BRANCH"
git push origin "$TAG_NAME"

# Return to develop branch
git checkout "$DEVELOP_BRANCH"

echo -e "\n${GREEN}=== Release $VERSION complete! ===${NC}"
echo -e "Tag: $TAG_NAME"
echo -e "All branches and tags have been pushed to origin."
```
setup.sh (new executable file, 154 lines)
```bash
#!/bin/bash
set -e

# SneakyScanner First-Run Setup Script
# This script helps you get started quickly with SneakyScanner

echo "================================================"
echo " SneakyScanner - First Run Setup"
echo "================================================"
echo ""

# Function to generate random key for Flask SECRET_KEY
generate_secret_key() {
    openssl rand -hex 32 2>/dev/null || python3 -c "import secrets; print(secrets.token_hex(32))"
}

# Function to generate Fernet encryption key (32 url-safe base64-encoded bytes)
generate_fernet_key() {
    # Fallback translates '+/' to '-_' so the key stays URL-safe, as Fernet expects
    python3 -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())" 2>/dev/null || \
        openssl rand -base64 32 | tr '+/' '-_' | head -c 44
}

# Check if .env exists
if [ -f .env ]; then
    echo "✓ .env file already exists"
    read -p "Do you want to regenerate it? (y/N): " REGENERATE
    if [ "$REGENERATE" != "y" ] && [ "$REGENERATE" != "Y" ]; then
        echo "Skipping .env creation..."
        SKIP_ENV=true
    fi
fi

# Create or update .env
if [ "$SKIP_ENV" != "true" ]; then
    echo ""
    echo "Creating .env file..."

    # Generate secure keys
    SECRET_KEY=$(generate_secret_key)
    ENCRYPTION_KEY=$(generate_fernet_key)

    # Ask for initial password
    echo ""
    echo "Set an initial password for the web interface:"
    read -s -p "Password (or press Enter to generate random password): " INITIAL_PASSWORD
    echo ""

    if [ -z "$INITIAL_PASSWORD" ]; then
        echo "Generating random password..."
        # Generate a 32-character alphanumeric password
        INITIAL_PASSWORD=$(cat /dev/urandom | tr -dc 'A-Za-z0-9' | head -c 32)
        # Save password to file in project root (avoid permission issues with mounted volumes)
        echo "$INITIAL_PASSWORD" > admin_password.txt
        echo "✓ Random password generated and saved to: ./admin_password.txt"
        PASSWORD_SAVED=true
    fi

    # Create .env file
    cat > .env << EOF
# Flask Configuration
FLASK_ENV=production
FLASK_DEBUG=false

# Security Keys (randomly generated)
SECRET_KEY=$SECRET_KEY
SNEAKYSCANNER_ENCRYPTION_KEY=$ENCRYPTION_KEY

# Initial Password
INITIAL_PASSWORD=$INITIAL_PASSWORD

# Database Configuration
DATABASE_URL=sqlite:////app/data/sneakyscanner.db

# Optional: Logging
LOG_LEVEL=INFO

# Optional: CORS (comma-separated origins, or * for all)
CORS_ORIGINS=*
EOF

    echo "✓ .env file created with secure keys"

    # Remove the init marker so the password gets set on next container start
    rm -f data/.db_initialized
    echo "✓ Password will be updated on next container start"
fi

# Create required directories
echo ""
echo "Creating required directories..."
mkdir -p data logs output configs
echo "✓ Directories created"

# Check if Docker is running
echo ""
echo "Checking Docker..."
if ! docker info > /dev/null 2>&1; then
    echo "✗ Docker is not running or not installed"
    echo "Please install Docker and start the Docker daemon"
    exit 1
fi
echo "✓ Docker is running"

# Build and start
echo ""
echo "Building and starting SneakyScanner..."
echo "This may take a few minutes on first run..."
echo ""

docker compose build

echo ""
echo "Starting SneakyScanner..."
docker compose up -d

# Wait for service to be healthy
echo ""
echo "Waiting for application to start..."
sleep 5

# Check if container is running
if docker ps | grep -q sneakyscanner-web; then
    echo ""
    echo "================================================"
    echo " ✓ SneakyScanner is Running!"
    echo "================================================"
    echo ""
    echo "Web Interface: http://localhost:5000"
    echo ""
    echo "Login with:"
    if [ -z "$SKIP_ENV" ]; then
        if [ "$PASSWORD_SAVED" = "true" ]; then
            echo " Password saved in: ./admin_password.txt"
            echo " Password: $INITIAL_PASSWORD"
        else
            echo " Password: $INITIAL_PASSWORD"
        fi
    else
        echo " Password: (check your .env file or ./admin_password.txt)"
    fi
    echo ""
    echo "Useful commands:"
    echo " docker compose logs -f # View logs"
    echo " docker compose stop # Stop the service"
    echo " docker compose restart # Restart the service"
    echo ""
    echo "⚠ IMPORTANT: Change your password after first login!"
    echo "================================================"
else
    echo ""
    echo "✗ Container failed to start. Check logs with:"
    echo " docker compose logs"
    exit 1
fi
```