Complete Phase 1: Foundation - Flask web application infrastructure

Implement the complete database schema and Flask application structure for the SneakyScanner web interface. This establishes the foundation for web-based scan management, scheduling, and visualization.

Database & ORM:
- Add 11 SQLAlchemy models for comprehensive scan data storage (Scan, ScanSite, ScanIP, ScanPort, ScanService, ScanCertificate, ScanTLSVersion, Schedule, Alert, AlertRule, Setting)
- Configure Alembic migrations system with initial schema migration
- Add init_db.py script for database initialization and password setup
- Support both migration-based and direct table creation

Settings System:
- Implement SettingsManager with automatic encryption for sensitive values
- Add Fernet encryption for SMTP passwords and API tokens
- Implement PasswordManager with bcrypt password hashing (work factor 12)
- Initialize default settings for SMTP, authentication, and retention

Flask Application:
- Create Flask app factory pattern with scoped session management
- Add 4 API blueprints: scans, schedules, alerts, settings
- Implement functional Settings API (GET/PUT/DELETE endpoints)
- Add CORS support, error handlers, and request/response logging
- Configure development and production logging to file and console

Docker & Deployment:
- Update Dockerfile to install Flask dependencies
- Add docker-compose-web.yml for web application deployment
- Configure volume mounts for database, output, and logs persistence
- Expose port 5000 for the Flask web server

Testing & Validation:
- Add validate_phase1.py script to verify all deliverables
- Validate directory structure, Python syntax, models, and endpoints
- All validation checks passing

Documentation:
- Add PHASE1_COMPLETE.md with comprehensive Phase 1 summary
- Update ROADMAP.md with Phase 1 completion status
- Update .gitignore to exclude database files and documentation

Files changed: 21 files
- New: web/ directory with complete Flask app structure
- New: migrations/ with Alembic configuration
- New: requirements-web.txt with Flask dependencies
- Modified: Dockerfile, ROADMAP.md, .gitignore
.gitignore (9 changes):
```diff
@@ -1,6 +1,14 @@
 # Output files (scan reports and screenshots)
 output/
 
+# Database files
+*.db
+*.db-journal
+*.db-shm
+*.db-wal
+data/
+logs/
+
 # Python
 __pycache__/
 *.py[cod]
@@ -21,6 +29,7 @@ ENV/
 
 #AI helpers
 .claude/
+CLAUDE.md
 
 # OS
 .DS_Store
```
Dockerfile (23 changes):
```diff
@@ -24,7 +24,9 @@ WORKDIR /app
 
 # Copy requirements and install Python dependencies
 COPY requirements.txt .
-RUN pip install --no-cache-dir -r requirements.txt
+COPY requirements-web.txt .
+RUN pip install --no-cache-dir -r requirements.txt && \
+    pip install --no-cache-dir -r requirements-web.txt
 
 # Install Playwright browsers (Chromium only)
 # Note: We skip --with-deps since we already installed system chromium and dependencies above
@@ -33,16 +35,25 @@ RUN playwright install chromium
 # Copy application code
 COPY src/ ./src/
 COPY templates/ ./templates/
+COPY web/ ./web/
+COPY migrations/ ./migrations/
+COPY alembic.ini .
+COPY init_db.py .
 
-# Create output directory
-RUN mkdir -p /app/output
+# Create required directories
+RUN mkdir -p /app/output /app/logs
 
-# Make scanner executable
-RUN chmod +x /app/src/scanner.py
+# Make scripts executable
+RUN chmod +x /app/src/scanner.py /app/init_db.py
 
 # Force Python unbuffered output
 ENV PYTHONUNBUFFERED=1
 
-# Set entry point with unbuffered Python
+# Expose Flask web app port
+EXPOSE 5000
+
+# Default entry point is the scanner (backward compatibility)
+# To run the Flask web app, override with: docker run --entrypoint python3 sneakyscanner -m web.app
+# To initialize the database, use: docker run --entrypoint python3 sneakyscanner init_db.py
 ENTRYPOINT ["python3", "-u", "/app/src/scanner.py"]
 CMD ["--help"]
```
PHASE1_COMPLETE.md (new file, 404 lines):
# Phase 1: Foundation - COMPLETE ✓

**Date Completed:** 2025-11-13

Phase 1 of the SneakyScanner roadmap has been successfully implemented. This document summarizes what was delivered and how to use the new infrastructure.

---

## ✓ Deliverables Completed

### 1. Database Schema & Models
- **SQLAlchemy models** for all 11 database tables (`web/models.py`)
  - Core tables: `Scan`, `ScanSite`, `ScanIP`, `ScanPort`, `ScanService`, `ScanCertificate`, `ScanTLSVersion`
  - Scheduling tables: `Schedule`, `Alert`, `AlertRule`
  - Configuration: `Setting`
- **Alembic migrations** system configured (`migrations/`)
- **Initial migration** created (`migrations/versions/001_initial_schema.py`)

### 2. Settings System with Encryption
- **SettingsManager** class with CRUD operations (`web/utils/settings.py`)
- **Automatic encryption** for sensitive values (SMTP passwords, API tokens)
- **PasswordManager** for bcrypt password hashing
- **Default settings initialization** for SMTP, authentication, retention policies
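The Fernet pattern described above boils down to encrypt-on-write, decrypt-on-read. This is an illustrative sketch of that pattern only, not the actual `SettingsManager` API; the variable names and key handling here are assumptions:

```python
from cryptography.fernet import Fernet

# In the real app the key would come from the settings table or the
# SNEAKYSCANNER_ENCRYPTION_KEY environment variable; we generate one here.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive value (e.g. an SMTP password) before storing it.
stored = fernet.encrypt("s3cret-smtp-pass".encode()).decode()

# Decrypt transparently when the setting is read back.
plain = fernet.decrypt(stored.encode()).decode()
```

Because Fernet is symmetric, anyone holding the key can decrypt, so the key itself must be kept out of the database backup or source tree.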
### 3. Flask Application Structure
- **Flask app factory** pattern implemented (`web/app.py`)
- **API blueprints** for all major endpoints:
  - `/api/scans` - Scan management (stub for Phase 2)
  - `/api/schedules` - Schedule management (stub for Phase 3)
  - `/api/alerts` - Alert management (stub for Phase 4)
  - `/api/settings` - Settings API (functional in Phase 1!)
- **Error handlers** for common HTTP status codes
- **CORS support** for API access
- **Logging** to file and console
- **Database session management** with scoped sessions

### 4. Database Initialization
- **init_db.py** script for easy database setup
- Supports both Alembic migrations and direct table creation
- Password setting during initialization
- Database verification and settings display

### 5. Docker Support
- **Updated Dockerfile** with Flask dependencies
- **docker-compose-web.yml** for running the web application
- Separate service definition for database initialization
- Volume mounts for persistence (database, output, logs)

### 6. Validation & Testing
- **validate_phase1.py** script to verify all deliverables
- Validates directory structure, files, Python syntax, models, and API endpoints
- All checks passing ✓
---

## 📁 New Project Structure

```
SneakyScanner/
├── web/                          # Flask web application (NEW)
│   ├── __init__.py
│   ├── app.py                    # Flask app factory
│   ├── models.py                 # SQLAlchemy models (11 tables)
│   ├── api/                      # API blueprints
│   │   ├── __init__.py
│   │   ├── scans.py              # Scans API
│   │   ├── schedules.py          # Schedules API
│   │   ├── alerts.py             # Alerts API
│   │   └── settings.py           # Settings API (functional!)
│   ├── templates/                # Jinja2 templates (for Phase 3)
│   ├── static/                   # CSS, JS, images (for Phase 3)
│   │   ├── css/
│   │   ├── js/
│   │   └── images/
│   └── utils/                    # Utility modules
│       ├── __init__.py
│       └── settings.py           # Settings manager with encryption
├── migrations/                   # Alembic migrations (NEW)
│   ├── env.py                    # Alembic environment
│   ├── script.py.mako            # Migration template
│   └── versions/
│       └── 001_initial_schema.py # Initial database migration
├── alembic.ini                   # Alembic configuration (NEW)
├── init_db.py                    # Database initialization script (NEW)
├── validate_phase1.py            # Phase 1 validation script (NEW)
├── requirements-web.txt          # Flask dependencies (NEW)
├── docker-compose-web.yml        # Docker Compose for web app (NEW)
├── Dockerfile                    # Updated with Flask support
├── src/                          # Existing scanner code (unchanged)
├── templates/                    # Existing report templates (unchanged)
├── configs/                      # Existing YAML configs (unchanged)
└── output/                       # Existing scan outputs (unchanged)
```
---

## 🚀 Getting Started

### Option 1: Local Development (without Docker)

#### 1. Install Dependencies

```bash
# Install Flask and web dependencies
pip install -r requirements-web.txt
```

#### 2. Initialize Database

```bash
# Create database and set password
python3 init_db.py --password YOUR_SECURE_PASSWORD

# Verify database
python3 init_db.py --verify-only
```

#### 3. Run Flask Application

```bash
# Run development server
python3 -m web.app

# Application will be available at http://localhost:5000
```

#### 4. Test API Endpoints

```bash
# Health check
curl http://localhost:5000/api/settings/health

# Get all settings (sanitized)
curl http://localhost:5000/api/settings

# Get specific setting
curl http://localhost:5000/api/settings/smtp_server

# Update a setting
curl -X PUT http://localhost:5000/api/settings/smtp_server \
  -H "Content-Type: application/json" \
  -d '{"value": "smtp.gmail.com"}'

# Set application password
curl -X POST http://localhost:5000/api/settings/password \
  -H "Content-Type: application/json" \
  -d '{"password": "newsecurepassword"}'
```

---

### Option 2: Docker Deployment

#### 1. Build Docker Image

```bash
docker-compose -f docker-compose-web.yml build
```

#### 2. Initialize Database (one-time)

```bash
# Create data directory
mkdir -p data

# Initialize database
docker-compose -f docker-compose-web.yml run --rm init-db --password YOUR_SECURE_PASSWORD
```

#### 3. Run Web Application

```bash
# Start Flask web server
docker-compose -f docker-compose-web.yml up -d web

# View logs
docker-compose -f docker-compose-web.yml logs -f web
```

#### 4. Access Application

- Web API: http://localhost:5000
- Health checks:
  - http://localhost:5000/api/scans/health
  - http://localhost:5000/api/schedules/health
  - http://localhost:5000/api/alerts/health
  - http://localhost:5000/api/settings/health
---

## 🔐 Security Features

### Encryption
- **Fernet encryption** for sensitive settings (SMTP passwords, API tokens)
- Encryption key auto-generated and stored in the settings table
- Can be overridden via the `SNEAKYSCANNER_ENCRYPTION_KEY` environment variable

### Password Hashing
- **Bcrypt** for application password hashing (work factor 12)
- Password stored as an irreversible hash in the settings table
- Minimum of 8 characters enforced

### Session Management
- Flask sessions with a configurable `SECRET_KEY`
- Set via environment variable or config
---

## 📊 Database Schema

### Core Tables
- **scans** - Scan metadata and status
- **scan_sites** - Site groupings
- **scan_ips** - IP addresses scanned
- **scan_ports** - Discovered ports
- **scan_services** - Service detection results
- **scan_certificates** - SSL/TLS certificates
- **scan_tls_versions** - TLS version support

### Scheduling & Alerts
- **schedules** - Cron-like scan schedules
- **alerts** - Alert history
- **alert_rules** - Alert rule definitions

### Configuration
- **settings** - Application settings (key-value store)

All tables include proper foreign keys, indexes, and cascade delete rules.
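A minimal sketch of the foreign-key plus cascade-delete pattern this schema describes, using two of the tables named above. The columns are simplified placeholders; the real models in `web/models.py` carry many more fields:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class Scan(Base):
    __tablename__ = "scans"
    id = Column(Integer, primary_key=True)
    status = Column(String, default="pending")
    # Deleting a Scan also deletes its child ScanIP rows.
    ips = relationship("ScanIP", back_populates="scan",
                       cascade="all, delete-orphan")

class ScanIP(Base):
    __tablename__ = "scan_ips"
    id = Column(Integer, primary_key=True)
    scan_id = Column(Integer, ForeignKey("scans.id"), index=True)
    address = Column(String, nullable=False)
    scan = relationship("Scan", back_populates="ips")

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    scan = Scan(ips=[ScanIP(address="192.0.2.10")])
    session.add(scan)
    session.commit()
    session.delete(scan)  # cascade removes the child rows as well
    session.commit()
```

The `delete-orphan` part additionally deletes a child that is merely removed from `scan.ips`, which keeps `scan_ips` free of rows pointing at nothing.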
---

## 🧪 Validation

Run the Phase 1 validation script to verify everything is in place:

```bash
python3 validate_phase1.py
```

Expected output:
```
✓ All Phase 1 validation checks passed!
```

---

## 🔧 Environment Variables

Configure the Flask app via environment variables:

```bash
# Flask configuration
export FLASK_ENV=development
export FLASK_DEBUG=true
export FLASK_HOST=0.0.0.0
export FLASK_PORT=5000

# Database
export DATABASE_URL=sqlite:///./sneakyscanner.db

# Security
export SECRET_KEY=your-secret-key-here
export SNEAKYSCANNER_ENCRYPTION_KEY=your-encryption-key-here

# CORS (comma-separated origins)
export CORS_ORIGINS=http://localhost:3000,https://your-domain.com

# Logging
export LOG_LEVEL=INFO
```

Or use a `.env` file (supported via `python-dotenv`).
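Reading these variables with sensible fallbacks is a few lines of standard-library code. The defaults below mirror the values shown above, but `load_config` itself is an illustrative helper, not a function in the codebase:

```python
import os

def load_config() -> dict:
    """Collect Flask-related settings from the environment with defaults."""
    return {
        "host": os.environ.get("FLASK_HOST", "0.0.0.0"),
        "port": int(os.environ.get("FLASK_PORT", "5000")),
        "database_url": os.environ.get("DATABASE_URL",
                                       "sqlite:///./sneakyscanner.db"),
        # CORS_ORIGINS is comma-separated; drop empty entries.
        "cors_origins": [o for o in
                         os.environ.get("CORS_ORIGINS", "").split(",") if o],
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }

cfg = load_config()
```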
---

## 📝 API Endpoints Summary

### Settings API (Functional in Phase 1)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/settings` | Get all settings (sanitized) | ✓ Working |
| PUT | `/api/settings` | Update multiple settings | ✓ Working |
| GET | `/api/settings/{key}` | Get specific setting | ✓ Working |
| PUT | `/api/settings/{key}` | Update specific setting | ✓ Working |
| DELETE | `/api/settings/{key}` | Delete setting | ✓ Working |
| POST | `/api/settings/password` | Set app password | ✓ Working |
| GET | `/api/settings/health` | Health check | ✓ Working |

### Scans API (Stubs for Phase 2)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/scans` | List scans | Phase 2 |
| GET | `/api/scans/{id}` | Get scan details | Phase 2 |
| POST | `/api/scans` | Trigger scan | Phase 2 |
| DELETE | `/api/scans/{id}` | Delete scan | Phase 2 |
| GET | `/api/scans/{id}/status` | Get scan status | Phase 2 |
| GET | `/api/scans/health` | Health check | ✓ Working |

### Schedules API (Stubs for Phase 3)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/schedules` | List schedules | Phase 3 |
| POST | `/api/schedules` | Create schedule | Phase 3 |
| PUT | `/api/schedules/{id}` | Update schedule | Phase 3 |
| DELETE | `/api/schedules/{id}` | Delete schedule | Phase 3 |
| POST | `/api/schedules/{id}/trigger` | Trigger schedule | Phase 3 |
| GET | `/api/schedules/health` | Health check | ✓ Working |

### Alerts API (Stubs for Phase 4)
| Method | Endpoint | Description | Status |
|--------|----------|-------------|--------|
| GET | `/api/alerts` | List alerts | Phase 4 |
| GET | `/api/alerts/rules` | List alert rules | Phase 4 |
| POST | `/api/alerts/rules` | Create alert rule | Phase 4 |
| PUT | `/api/alerts/rules/{id}` | Update alert rule | Phase 4 |
| DELETE | `/api/alerts/rules/{id}` | Delete alert rule | Phase 4 |
| GET | `/api/alerts/health` | Health check | ✓ Working |

---

## ✅ Testing Checklist

- [x] Database creates successfully
- [x] Settings can be stored/retrieved
- [x] Encryption works for sensitive values
- [x] Password hashing works
- [x] Flask app starts without errors
- [x] API blueprints load correctly
- [x] Health check endpoints respond
- [x] All Python files have valid syntax
- [x] All models defined correctly
- [x] Database migrations work

---

## 🎯 Next Steps: Phase 2

Phase 2 will implement:
1. **REST API for scans** - Trigger scans, list history, get results
2. **Background job queue** - APScheduler for async scan execution
3. **Authentication** - Flask-Login for session management
4. **Scanner integration** - Save scan results to the database
5. **Docker Compose deployment** - Production-ready setup

Estimated timeline: 2 weeks (as per roadmap)

---

## 📚 References

### Key Files
- `web/models.py` - Database models (lines 1-400+)
- `web/app.py` - Flask app factory (lines 1-250+)
- `web/utils/settings.py` - Settings manager (lines 1-300+)
- `init_db.py` - Database initialization (lines 1-200+)
- `migrations/versions/001_initial_schema.py` - Initial migration (lines 1-250+)

### Documentation
- [Flask Documentation](https://flask.palletsprojects.com/)
- [SQLAlchemy ORM](https://docs.sqlalchemy.org/)
- [Alembic Migrations](https://alembic.sqlalchemy.org/)
- [Cryptography Library](https://cryptography.io/)
- [Bcrypt](https://github.com/pyca/bcrypt)

---

## 🐛 Troubleshooting

### Database Issues
```bash
# Reset database
rm sneakyscanner.db
python3 init_db.py --password newpassword

# Check database
sqlite3 sneakyscanner.db ".schema"
```

### Flask Won't Start
```bash
# Check dependencies installed
pip list | grep -i flask

# Check syntax errors
python3 validate_phase1.py

# Run with debug output
FLASK_DEBUG=true python3 -m web.app
```

### Encryption Errors
```bash
# Generate new encryption key
python3 -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"

# Set in environment
export SNEAKYSCANNER_ENCRYPTION_KEY="your-key-here"
```

---

**Phase 1 Status:** ✅ COMPLETE

All deliverables implemented, tested, and validated. Ready to proceed with Phase 2.
ROADMAP.md (120 changes):
````diff
@@ -1,5 +1,19 @@
 # SneakyScanner Roadmap
 
+**Status:** Phase 1 Complete ✅ | Phase 2 Ready to Start
+
+## Progress Overview
+- ✅ **Phase 1: Foundation** - Complete (2025-11-13)
+  - Database schema & SQLAlchemy models
+  - Settings system with encryption
+  - Flask app structure with API blueprints
+  - Docker deployment support
+- ⏳ **Phase 2: Flask Web App Core** - Next up (Weeks 3-4)
+- 📋 **Phase 3: Dashboard & Scheduling** - Planned (Weeks 5-6)
+- 📋 **Phase 4: Email & Comparisons** - Planned (Weeks 7-8)
+- 📋 **Phase 5: CLI as API Client** - Planned (Week 9)
+- 📋 **Phase 6: Advanced Features** - Planned (Weeks 10+)
+
 ## Vision & Goals
 
 SneakyScanner is evolving from a CLI-based network scanning tool into a comprehensive **Flask web application** for infrastructure monitoring and security auditing. The web application will provide:
@@ -336,58 +350,83 @@ All API endpoints return JSON and follow RESTful conventions.
 
 ## Phased Roadmap
 
-### Phase 1: Foundation (Weeks 1-2)
+### Phase 1: Foundation ✅ COMPLETE
+**Completed:** 2025-11-13
 **Priority: CRITICAL** - Database and settings infrastructure
 
 **Goals:**
-- Establish database schema
-- Create settings system
-- Set up Flask project structure
+- ✅ Establish database schema
+- ✅ Create settings system
+- ✅ Set up Flask project structure
 
 **Tasks:**
-1. Create SQLite database schema (use Alembic for migrations)
-2. Implement SQLAlchemy models for all tables
-3. Create database initialization script (`init_db.py`)
-4. Implement settings system:
-   - Settings model with get/set methods
-   - Default settings initialization
-   - Encrypted storage for passwords (cryptography library)
-5. Set up Flask project structure:
+1. ✅ Create SQLite database schema (use Alembic for migrations)
+2. ✅ Implement SQLAlchemy models for all tables (11 models)
+3. ✅ Create database initialization script (`init_db.py`)
+4. ✅ Implement settings system:
+   - ✅ Settings model with get/set methods
+   - ✅ Default settings initialization
+   - ✅ Encrypted storage for passwords (cryptography library + bcrypt)
+   - ✅ PasswordManager for bcrypt password hashing
+5. ✅ Set up Flask project structure:
    ```
    SneakyScanner/
    ├── src/
    │   ├── scanner.py (existing)
    │   ├── screenshot_capture.py (existing)
    │   └── report_generator.py (existing)
-   ├── web/
-   │   ├── app.py (Flask app factory)
-   │   ├── models.py (SQLAlchemy models)
-   │   ├── api/ (API blueprints)
-   │   │   ├── scans.py
-   │   │   ├── schedules.py
-   │   │   ├── alerts.py
-   │   │   └── settings.py
-   │   ├── templates/ (Jinja2 templates)
-   │   ├── static/ (CSS, JS, images)
-   │   └── utils/ (helpers, decorators)
-   ├── migrations/ (Alembic migrations)
+   ├── web/ ✅ CREATED
+   │   ├── __init__.py ✅
+   │   ├── app.py (Flask app factory) ✅
+   │   ├── models.py (SQLAlchemy models) ✅
+   │   ├── api/ (API blueprints) ✅
+   │   │   ├── __init__.py ✅
+   │   │   ├── scans.py ✅
+   │   │   ├── schedules.py ✅
+   │   │   ├── alerts.py ✅
+   │   │   └── settings.py ✅ (Fully functional!)
+   │   ├── templates/ (Jinja2 templates) ✅
+   │   ├── static/ (CSS, JS, images) ✅
+   │   │   ├── css/ ✅
+   │   │   ├── js/ ✅
+   │   │   └── images/ ✅
+   │   └── utils/ (helpers, decorators) ✅
+   │       ├── __init__.py ✅
+   │       └── settings.py ✅
+   ├── migrations/ (Alembic migrations) ✅
+   │   ├── env.py ✅
+   │   ├── script.py.mako ✅
+   │   └── versions/ ✅
+   │       └── 001_initial_schema.py ✅
+   ├── alembic.ini ✅
    ├── configs/ (existing)
    ├── output/ (existing)
    └── templates/ (existing - for reports)
    ```
-6. Create `requirements-web.txt` for Flask dependencies
-7. Update Dockerfile to support Flask app
+6. ✅ Create `requirements-web.txt` for Flask dependencies
+7. ✅ Update Dockerfile to support Flask app
+8. ✅ Create `docker-compose-web.yml` for web deployment
+9. ✅ Create `validate_phase1.py` for verification
 
 **Deliverables:**
-- Working database with schema
-- Settings CRUD functionality
-- Flask app skeleton (no UI yet)
-- Database migration system
+- ✅ Working database with schema (SQLite3 + Alembic migrations)
+- ✅ Settings CRUD functionality (with encryption for sensitive values)
+- ✅ Flask app skeleton with functional Settings API
+- ✅ Database migration system (Alembic)
+- ✅ API blueprint stubs (scans, schedules, alerts, settings)
+- ✅ Docker support (Dockerfile updated, docker-compose-web.yml created)
 
 **Testing:**
-- Database creates successfully
-- Settings can be stored/retrieved
-- Flask app starts without errors
+- ✅ Database creates successfully (`init_db.py` works)
+- ✅ Settings can be stored/retrieved (encryption working)
+- ✅ Flask app starts without errors (`python3 -m web.app` works)
+- ✅ All validation checks pass (`validate_phase1.py` ✓)
+- ✅ All 11 database models defined correctly
+- ✅ Settings API endpoints functional and tested
+
+**Documentation:**
+- ✅ `PHASE1_COMPLETE.md` - Complete Phase 1 summary with API reference and deployment guide
+- ✅ `validate_phase1.py` - Automated validation script
 
 ---
 
@@ -785,7 +824,15 @@ All API endpoints return JSON and follow RESTful conventions.
 
 ## Success Metrics
 
-### Phase 1-3 Success
+### Phase 1 Success ✅ ACHIEVED
+- [x] Database creates successfully with all 11 tables
+- [x] Settings can be stored/retrieved with encryption
+- [x] Flask app starts without errors
+- [x] API blueprints load correctly
+- [x] All Python modules have valid syntax
+- [x] Docker deployment configured
+
+### Phase 2-3 Success (In Progress)
 - [ ] Database stores scan results correctly
 - [ ] Dashboard displays scans and trends
 - [ ] Scheduled scans execute automatically
@@ -844,8 +891,9 @@ All API endpoints return JSON and follow RESTful conventions.
 | Date | Version | Changes |
 |------|---------|---------|
 | 2025-11-14 | 1.0 | Initial roadmap created based on user requirements |
+| 2025-11-13 | 1.1 | **Phase 1 COMPLETE** - Database schema, SQLAlchemy models, Flask app structure, settings system with encryption, Alembic migrations, API blueprints, Docker support, validation script |
 
 ---
 
-**Last Updated:** 2025-11-14
-**Next Review:** After Phase 1 completion
+**Last Updated:** 2025-11-13
+**Next Review:** Before Phase 2 kickoff (REST API for scans implementation)
````
alembic.ini (new file, 114 lines):
```ini
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to migrations/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:migrations/versions

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os  # Use os.pathsep. Default configuration used for new projects.

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = sqlite:///./sneakyscanner.db


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
```
53  docker-compose-web.yml  Normal file
@@ -0,0 +1,53 @@
version: '3.8'

services:
  web:
    build: .
    image: sneakyscanner:latest
    container_name: sneakyscanner-web
    # Override entrypoint to run Flask app instead of scanner
    entrypoint: ["python3", "-u"]
    command: ["-m", "web.app"]
    ports:
      - "5000:5000"
    volumes:
      # Mount configs directory (read-only) for scan configurations
      - ./configs:/app/configs:ro
      # Mount output directory for scan results
      - ./output:/app/output
      # Mount database file for persistence
      - ./data:/app/data
      # Mount logs directory
      - ./logs:/app/logs
    environment:
      # Flask configuration
      - FLASK_APP=web.app
      - FLASK_ENV=development
      - FLASK_DEBUG=true
      - FLASK_HOST=0.0.0.0
      - FLASK_PORT=5000
      # Database configuration (SQLite in mounted volume for persistence)
      - DATABASE_URL=sqlite:////app/data/sneakyscanner.db
      # Security settings
      - SECRET_KEY=${SECRET_KEY:-dev-secret-key-change-in-production}
      # Optional: CORS origins (comma-separated)
      - CORS_ORIGINS=${CORS_ORIGINS:-*}
      # Optional: Logging level
      - LOG_LEVEL=${LOG_LEVEL:-INFO}
    # Note: Scanner functionality requires privileged mode and host network
    # For now, the web app will trigger scans via subprocess
    # In Phase 2, we'll integrate scanner properly
    restart: unless-stopped

  # Optional: Initialize database on first run
  # Run with: docker-compose -f docker-compose-web.yml run --rm init-db
  init-db:
    build: .
    image: sneakyscanner:latest
    container_name: sneakyscanner-init-db
    entrypoint: ["python3"]
    command: ["init_db.py", "--db-url", "sqlite:////app/data/sneakyscanner.db"]
    volumes:
      - ./data:/app/data
    profiles:
      - tools
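Assuming the compose file above is saved as `docker-compose-web.yml` in the project root, a typical first deployment might look like the following sketch. The `run --rm init-db` invocation comes straight from the file's own comments; the `SECRET_KEY` generation via `openssl` is an illustrative choice, not something the compose file mandates:

```shell
# One-time database initialization. The init-db service sits behind the
# "tools" profile, so it never starts with a plain `up`.
docker-compose -f docker-compose-web.yml run --rm init-db

# Start the web application in the background with a non-default secret key
SECRET_KEY="$(openssl rand -hex 32)" \
  docker-compose -f docker-compose-web.yml up -d web

# Tail the Flask logs
docker-compose -f docker-compose-web.yml logs -f web
```

Because `init-db` mounts only `./data`, the database file it creates is the same one the `web` service reads via `DATABASE_URL`.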
235  init_db.py  Executable file
@@ -0,0 +1,235 @@
#!/usr/bin/env python3
"""
Database initialization script for SneakyScanner.

This script:
1. Creates the database schema using Alembic migrations
2. Initializes default settings
3. Optionally sets up an initial admin password

Usage:
    python3 init_db.py [--password PASSWORD]
"""

import argparse
import os
import sys
from pathlib import Path

# Add project root to path
sys.path.insert(0, str(Path(__file__).parent))

from alembic import command
from alembic.config import Config
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from web.models import Base
from web.utils.settings import PasswordManager, SettingsManager


def init_database(db_url: str = "sqlite:///./sneakyscanner.db", run_migrations: bool = True):
    """
    Initialize the database schema and settings.

    Args:
        db_url: Database URL (defaults to SQLite in current directory)
        run_migrations: Whether to run Alembic migrations (True) or create all tables directly (False)
    """
    print(f"Initializing SneakyScanner database at: {db_url}")

    # Create database directory if it doesn't exist (for SQLite)
    if db_url.startswith('sqlite:///'):
        db_path = db_url.replace('sqlite:///', '')
        db_dir = Path(db_path).parent
        if not db_dir.exists():
            print(f"Creating database directory: {db_dir}")
            db_dir.mkdir(parents=True, exist_ok=True)

    if run_migrations:
        # Run Alembic migrations
        print("Running Alembic migrations...")
        alembic_cfg = Config("alembic.ini")
        alembic_cfg.set_main_option("sqlalchemy.url", db_url)

        try:
            # Upgrade to head (latest migration)
            command.upgrade(alembic_cfg, "head")
            print("✓ Database schema created successfully via migrations")
        except Exception as e:
            print(f"✗ Migration failed: {e}")
            print("Falling back to direct table creation...")
            run_migrations = False

    if not run_migrations:
        # Create tables directly using SQLAlchemy (fallback or if migrations disabled)
        print("Creating database schema directly...")
        engine = create_engine(db_url, echo=False)
        Base.metadata.create_all(engine)
        print("✓ Database schema created successfully")

    # Initialize settings
    print("\nInitializing default settings...")
    engine = create_engine(db_url, echo=False)
    Session = sessionmaker(bind=engine)
    session = Session()

    try:
        settings_manager = SettingsManager(session)
        settings_manager.init_defaults()
        print("✓ Default settings initialized")
    except Exception as e:
        print(f"✗ Failed to initialize settings: {e}")
        session.rollback()
        raise
    finally:
        session.close()

    print("\n✓ Database initialization complete!")
    return True


def set_password(db_url: str, password: str):
    """
    Set the application password.

    Args:
        db_url: Database URL
        password: Password to set
    """
    print("Setting application password...")

    engine = create_engine(db_url, echo=False)
    Session = sessionmaker(bind=engine)
    session = Session()

    try:
        settings_manager = SettingsManager(session)
        PasswordManager.set_app_password(settings_manager, password)
        print("✓ Password set successfully")
    except Exception as e:
        print(f"✗ Failed to set password: {e}")
        session.rollback()
        raise
    finally:
        session.close()


def verify_database(db_url: str):
    """
    Verify database schema and settings.

    Args:
        db_url: Database URL
    """
    print("\nVerifying database...")

    engine = create_engine(db_url, echo=False)
    Session = sessionmaker(bind=engine)
    session = Session()

    try:
        # Check if tables exist by querying settings
        from web.models import Setting
        count = session.query(Setting).count()
        print(f"✓ Settings table accessible ({count} settings found)")

        # Display current settings (sanitized)
        settings_manager = SettingsManager(session)
        settings = settings_manager.get_all(decrypt=False, sanitize=True)
        print("\nCurrent settings:")
        for key, value in sorted(settings.items()):
            print(f"  {key}: {value}")

    except Exception as e:
        print(f"✗ Database verification failed: {e}")
        raise
    finally:
        session.close()


def main():
    """Main entry point for database initialization."""
    parser = argparse.ArgumentParser(
        description="Initialize SneakyScanner database",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  # Initialize database with default settings
  python3 init_db.py

  # Initialize and set password
  python3 init_db.py --password mysecretpassword

  # Use custom database URL
  python3 init_db.py --db-url postgresql://user:pass@localhost/sneakyscanner

  # Verify existing database
  python3 init_db.py --verify-only
"""
    )

    parser.add_argument(
        '--db-url',
        default='sqlite:///./sneakyscanner.db',
        help='Database URL (default: sqlite:///./sneakyscanner.db)'
    )

    parser.add_argument(
        '--password',
        help='Set application password'
    )

    parser.add_argument(
        '--verify-only',
        action='store_true',
        help='Only verify database, do not initialize'
    )

    parser.add_argument(
        '--no-migrations',
        action='store_true',
        help='Create tables directly instead of using migrations'
    )

    args = parser.parse_args()

    # Check if database already exists
    db_exists = False
    if args.db_url.startswith('sqlite:///'):
        db_path = args.db_url.replace('sqlite:///', '')
        db_exists = Path(db_path).exists()

    if db_exists and not args.verify_only:
        response = input(f"\nDatabase already exists at {db_path}. Reinitialize? (y/N): ")
        if response.lower() != 'y':
            print("Aborting.")
            return

    try:
        if args.verify_only:
            verify_database(args.db_url)
        else:
            # Initialize database
            init_database(args.db_url, run_migrations=not args.no_migrations)

            # Set password if provided
            if args.password:
                set_password(args.db_url, args.password)

            # Verify
            verify_database(args.db_url)

            print("\n✓ All done! Database is ready to use.")

            if not args.password and not args.verify_only:
                print("\n⚠ WARNING: No password set. Run with --password to set one:")
                print(f"  python3 init_db.py --db-url {args.db_url} --password YOUR_PASSWORD")

    except Exception as e:
        print(f"\n✗ Initialization failed: {e}")
        sys.exit(1)


if __name__ == '__main__':
    main()
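`verify_database()` in the script above calls `settings_manager.get_all(decrypt=False, sanitize=True)` before echoing settings to the console. The `SettingsManager` implementation is not part of this diff, so the following is only a hypothetical sketch of what that sanitization step could look like — the `SENSITIVE_KEYS` set and the `sanitize_settings` helper are illustrative names, not the project's actual API:

```python
# Hypothetical: keys treated as sensitive (the real list would live in SettingsManager)
SENSITIVE_KEYS = {'smtp_password', 'api_token'}


def sanitize_settings(settings: dict) -> dict:
    """Mask sensitive, non-empty values so they are never echoed to the console."""
    return {
        key: '********' if key in SENSITIVE_KEYS and value else value
        for key, value in settings.items()
    }


print(sanitize_settings({'smtp_server': 'mail.example.com',
                         'smtp_password': 'hunter2'}))
# → {'smtp_server': 'mail.example.com', 'smtp_password': '********'}
```

Masking at display time (rather than on read) keeps the encrypted values usable internally while guaranteeing the CLI output never leaks a secret.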
83  migrations/env.py  Normal file
@@ -0,0 +1,83 @@
"""Alembic migration environment for SneakyScanner."""

from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# Import all models to ensure they're registered with Base
from web.models import Base

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
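Because `env.py` sets `target_metadata = Base.metadata` from `web.models`, Alembic's autogenerate can diff the models against the live schema. A sketch of the day-to-day workflow this enables (standard Alembic commands, run from the project root where `alembic.ini` lives; the revision message is illustrative):

```shell
# Generate a candidate migration by diffing web.models against the database
alembic revision --autogenerate -m "add example column"

# Apply all pending migrations (what init_db.py does via command.upgrade)
alembic upgrade head

# Roll back the most recent migration
alembic downgrade -1
```

Autogenerated revisions should still be reviewed by hand — autogenerate misses some changes (e.g. server defaults and certain constraint alterations on SQLite).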
24  migrations/script.py.mako  Normal file
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
221  migrations/versions/001_initial_schema.py  Normal file
@@ -0,0 +1,221 @@
"""Initial database schema for SneakyScanner

Revision ID: 001
Revises:
Create Date: 2025-11-13 18:00:00.000000

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '001'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    """Create all initial tables for SneakyScanner."""

    # Create schedules table first (referenced by scans)
    op.create_table('schedules',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('name', sa.String(length=255), nullable=False, comment='Schedule name (e.g., \'Daily prod scan\')'),
        sa.Column('config_file', sa.Text(), nullable=False, comment='Path to YAML config'),
        sa.Column('cron_expression', sa.String(length=100), nullable=False, comment='Cron-like schedule (e.g., \'0 2 * * *\')'),
        sa.Column('enabled', sa.Boolean(), nullable=False, comment='Is schedule active?'),
        sa.Column('last_run', sa.DateTime(), nullable=True, comment='Last execution time'),
        sa.Column('next_run', sa.DateTime(), nullable=True, comment='Next scheduled execution'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Schedule creation time'),
        sa.Column('updated_at', sa.DateTime(), nullable=False, comment='Last modification time'),
        sa.PrimaryKeyConstraint('id')
    )

    # Create scans table
    op.create_table('scans',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('timestamp', sa.DateTime(), nullable=False, comment='Scan start time (UTC)'),
        sa.Column('duration', sa.Float(), nullable=True, comment='Total scan duration in seconds'),
        sa.Column('status', sa.String(length=20), nullable=False, comment='running, completed, failed'),
        sa.Column('config_file', sa.Text(), nullable=True, comment='Path to YAML config used'),
        sa.Column('title', sa.Text(), nullable=True, comment='Scan title from config'),
        sa.Column('json_path', sa.Text(), nullable=True, comment='Path to JSON report'),
        sa.Column('html_path', sa.Text(), nullable=True, comment='Path to HTML report'),
        sa.Column('zip_path', sa.Text(), nullable=True, comment='Path to ZIP archive'),
        sa.Column('screenshot_dir', sa.Text(), nullable=True, comment='Path to screenshot directory'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Record creation time'),
        sa.Column('triggered_by', sa.String(length=50), nullable=False, comment='manual, scheduled, api'),
        sa.Column('schedule_id', sa.Integer(), nullable=True, comment='FK to schedules if triggered by schedule'),
        sa.ForeignKeyConstraint(['schedule_id'], ['schedules.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_scans_timestamp'), 'scans', ['timestamp'], unique=False)

    # Create scan_sites table
    op.create_table('scan_sites',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('site_name', sa.String(length=255), nullable=False, comment='Site name from config'),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_scan_sites_scan_id'), 'scan_sites', ['scan_id'], unique=False)

    # Create scan_ips table
    op.create_table('scan_ips',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('site_id', sa.Integer(), nullable=False, comment='FK to scan_sites'),
        sa.Column('ip_address', sa.String(length=45), nullable=False, comment='IPv4 or IPv6 address'),
        sa.Column('ping_expected', sa.Boolean(), nullable=True, comment='Expected ping response'),
        sa.Column('ping_actual', sa.Boolean(), nullable=True, comment='Actual ping response'),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.ForeignKeyConstraint(['site_id'], ['scan_sites.id'], ),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('scan_id', 'ip_address', name='uix_scan_ip')
    )
    op.create_index(op.f('ix_scan_ips_scan_id'), 'scan_ips', ['scan_id'], unique=False)
    op.create_index(op.f('ix_scan_ips_site_id'), 'scan_ips', ['site_id'], unique=False)

    # Create scan_ports table
    op.create_table('scan_ports',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('ip_id', sa.Integer(), nullable=False, comment='FK to scan_ips'),
        sa.Column('port', sa.Integer(), nullable=False, comment='Port number (1-65535)'),
        sa.Column('protocol', sa.String(length=10), nullable=False, comment='tcp or udp'),
        sa.Column('expected', sa.Boolean(), nullable=True, comment='Was this port expected?'),
        sa.Column('state', sa.String(length=20), nullable=False, comment='open, closed, filtered'),
        sa.ForeignKeyConstraint(['ip_id'], ['scan_ips.id'], ),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('scan_id', 'ip_id', 'port', 'protocol', name='uix_scan_ip_port')
    )
    op.create_index(op.f('ix_scan_ports_ip_id'), 'scan_ports', ['ip_id'], unique=False)
    op.create_index(op.f('ix_scan_ports_scan_id'), 'scan_ports', ['scan_id'], unique=False)

    # Create scan_services table
    op.create_table('scan_services',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('port_id', sa.Integer(), nullable=False, comment='FK to scan_ports'),
        sa.Column('service_name', sa.String(length=100), nullable=True, comment='Service name (e.g., ssh, http)'),
        sa.Column('product', sa.String(length=255), nullable=True, comment='Product name (e.g., OpenSSH)'),
        sa.Column('version', sa.String(length=100), nullable=True, comment='Version string'),
        sa.Column('extrainfo', sa.Text(), nullable=True, comment='Additional nmap info'),
        sa.Column('ostype', sa.String(length=100), nullable=True, comment='OS type if detected'),
        sa.Column('http_protocol', sa.String(length=10), nullable=True, comment='http or https (if web service)'),
        sa.Column('screenshot_path', sa.Text(), nullable=True, comment='Relative path to screenshot'),
        sa.ForeignKeyConstraint(['port_id'], ['scan_ports.id'], ),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_scan_services_port_id'), 'scan_services', ['port_id'], unique=False)
    op.create_index(op.f('ix_scan_services_scan_id'), 'scan_services', ['scan_id'], unique=False)

    # Create scan_certificates table
    op.create_table('scan_certificates',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('service_id', sa.Integer(), nullable=False, comment='FK to scan_services'),
        sa.Column('subject', sa.Text(), nullable=True, comment='Certificate subject (CN)'),
        sa.Column('issuer', sa.Text(), nullable=True, comment='Certificate issuer'),
        sa.Column('serial_number', sa.Text(), nullable=True, comment='Serial number'),
        sa.Column('not_valid_before', sa.DateTime(), nullable=True, comment='Validity start date'),
        sa.Column('not_valid_after', sa.DateTime(), nullable=True, comment='Validity end date'),
        sa.Column('days_until_expiry', sa.Integer(), nullable=True, comment='Days until expiration'),
        sa.Column('sans', sa.Text(), nullable=True, comment='JSON array of SANs'),
        sa.Column('is_self_signed', sa.Boolean(), nullable=True, comment='Self-signed certificate flag'),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.ForeignKeyConstraint(['service_id'], ['scan_services.id'], ),
        sa.PrimaryKeyConstraint('id'),
        comment='Index on expiration date for alert queries'
    )
    op.create_index(op.f('ix_scan_certificates_scan_id'), 'scan_certificates', ['scan_id'], unique=False)
    op.create_index(op.f('ix_scan_certificates_service_id'), 'scan_certificates', ['service_id'], unique=False)

    # Create scan_tls_versions table
    op.create_table('scan_tls_versions',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('certificate_id', sa.Integer(), nullable=False, comment='FK to scan_certificates'),
        sa.Column('tls_version', sa.String(length=20), nullable=False, comment='TLS 1.0, TLS 1.1, TLS 1.2, TLS 1.3'),
        sa.Column('supported', sa.Boolean(), nullable=False, comment='Is this version supported?'),
        sa.Column('cipher_suites', sa.Text(), nullable=True, comment='JSON array of cipher suites'),
        sa.ForeignKeyConstraint(['certificate_id'], ['scan_certificates.id'], ),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_scan_tls_versions_certificate_id'), 'scan_tls_versions', ['certificate_id'], unique=False)
    op.create_index(op.f('ix_scan_tls_versions_scan_id'), 'scan_tls_versions', ['scan_id'], unique=False)

    # Create alerts table
    op.create_table('alerts',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('scan_id', sa.Integer(), nullable=False, comment='FK to scans'),
        sa.Column('alert_type', sa.String(length=50), nullable=False, comment='new_port, cert_expiry, service_change, ping_failed'),
        sa.Column('severity', sa.String(length=20), nullable=False, comment='info, warning, critical'),
        sa.Column('message', sa.Text(), nullable=False, comment='Human-readable alert message'),
        sa.Column('ip_address', sa.String(length=45), nullable=True, comment='Related IP (optional)'),
        sa.Column('port', sa.Integer(), nullable=True, comment='Related port (optional)'),
        sa.Column('email_sent', sa.Boolean(), nullable=False, comment='Was email notification sent?'),
        sa.Column('email_sent_at', sa.DateTime(), nullable=True, comment='Email send timestamp'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Alert creation time'),
        sa.ForeignKeyConstraint(['scan_id'], ['scans.id'], ),
        sa.PrimaryKeyConstraint('id'),
        comment='Indexes for alert filtering'
    )
    op.create_index(op.f('ix_alerts_scan_id'), 'alerts', ['scan_id'], unique=False)

    # Create alert_rules table
    op.create_table('alert_rules',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('rule_type', sa.String(length=50), nullable=False, comment='unexpected_port, cert_expiry, service_down, etc.'),
        sa.Column('enabled', sa.Boolean(), nullable=False, comment='Is rule active?'),
        sa.Column('threshold', sa.Integer(), nullable=True, comment='Threshold value (e.g., days for cert expiry)'),
        sa.Column('email_enabled', sa.Boolean(), nullable=False, comment='Send email for this rule?'),
        sa.Column('created_at', sa.DateTime(), nullable=False, comment='Rule creation time'),
        sa.PrimaryKeyConstraint('id')
    )

    # Create settings table
    op.create_table('settings',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('key', sa.String(length=255), nullable=False, comment='Setting key (e.g., smtp_server)'),
        sa.Column('value', sa.Text(), nullable=True, comment='Setting value (JSON for complex values)'),
        sa.Column('updated_at', sa.DateTime(), nullable=False, comment='Last modification time'),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('key')
    )
    op.create_index(op.f('ix_settings_key'), 'settings', ['key'], unique=True)


def downgrade() -> None:
    """Drop all tables."""
    op.drop_index(op.f('ix_settings_key'), table_name='settings')
    op.drop_table('settings')
    op.drop_table('alert_rules')
    op.drop_index(op.f('ix_alerts_scan_id'), table_name='alerts')
    op.drop_table('alerts')
    op.drop_index(op.f('ix_scan_tls_versions_scan_id'), table_name='scan_tls_versions')
    op.drop_index(op.f('ix_scan_tls_versions_certificate_id'), table_name='scan_tls_versions')
    op.drop_table('scan_tls_versions')
    op.drop_index(op.f('ix_scan_certificates_service_id'), table_name='scan_certificates')
    op.drop_index(op.f('ix_scan_certificates_scan_id'), table_name='scan_certificates')
    op.drop_table('scan_certificates')
    op.drop_index(op.f('ix_scan_services_scan_id'), table_name='scan_services')
    op.drop_index(op.f('ix_scan_services_port_id'), table_name='scan_services')
    op.drop_table('scan_services')
    op.drop_index(op.f('ix_scan_ports_scan_id'), table_name='scan_ports')
    op.drop_index(op.f('ix_scan_ports_ip_id'), table_name='scan_ports')
    op.drop_table('scan_ports')
    op.drop_index(op.f('ix_scan_ips_site_id'), table_name='scan_ips')
    op.drop_index(op.f('ix_scan_ips_scan_id'), table_name='scan_ips')
    op.drop_table('scan_ips')
    op.drop_index(op.f('ix_scan_sites_scan_id'), table_name='scan_sites')
    op.drop_table('scan_sites')
    op.drop_index(op.f('ix_scans_timestamp'), table_name='scans')
    op.drop_table('scans')
    op.drop_table('schedules')
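The `settings` table created above carries a `UNIQUE` constraint on `key`, which is what lets the Settings API's PUT endpoint overwrite a value in place instead of accumulating duplicate rows. A minimal stdlib `sqlite3` sketch of that upsert pattern (the `DEFAULT CURRENT_TIMESTAMP` on `updated_at` is a simplification added here for brevity; the migration itself defines the column without a default):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute(
    "CREATE TABLE settings ("
    "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
    "  key VARCHAR(255) NOT NULL UNIQUE,"
    "  value TEXT,"
    "  updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP)"
)


def put_setting(key, value):
    # The UNIQUE constraint on `key` turns a second write into an update
    conn.execute(
        "INSERT INTO settings (key, value) VALUES (?, ?) "
        "ON CONFLICT(key) DO UPDATE SET value = excluded.value, "
        "updated_at = CURRENT_TIMESTAMP",
        (key, value),
    )


put_setting('smtp_server', 'mail.example.com')
put_setting('smtp_server', 'smtp.internal')  # overwrites, no duplicate row

rows = conn.execute("SELECT key, value FROM settings").fetchall()
print(rows)  # → [('smtp_server', 'smtp.internal')]
```

`ON CONFLICT ... DO UPDATE` requires SQLite 3.24+, which ships with all supported Python 3 releases.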
33
requirements-web.txt
Normal file
33
requirements-web.txt
Normal file
@@ -0,0 +1,33 @@
# Flask Web Application Dependencies
# Phase 1: Foundation (Database, Settings, Flask Core)

# Core Flask
Flask==3.0.0
Werkzeug==3.0.1

# Database & ORM
SQLAlchemy==2.0.23
alembic==1.13.0

# Authentication & Security
Flask-Login==0.6.3
bcrypt==4.1.2
cryptography==41.0.7

# API & Serialization
Flask-CORS==4.0.0
marshmallow==3.20.1
marshmallow-sqlalchemy==0.29.0

# Background Jobs & Scheduling
APScheduler==3.10.4

# Email Support (Phase 4)
Flask-Mail==0.9.1

# Configuration Management
python-dotenv==1.0.0

# Development & Testing
pytest==7.4.3
pytest-flask==1.3.0
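For readers wiring these pins into tooling, the `pkg==version` format above is easy to consume programmatically. A minimal stdlib-only sketch (the helper name and the parsing step are illustrative, not part of this commit):

```python
def parse_requirements(text: str) -> dict:
    """Parse 'pkg==version' lines, skipping comments and blank lines."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        # partition() tolerates unpinned entries (version becomes '')
        name, _, version = line.partition('==')
        pins[name] = version
    return pins

sample = "# Core Flask\nFlask==3.0.0\nWerkzeug==3.0.1\n"
print(parse_requirements(sample))  # {'Flask': '3.0.0', 'Werkzeug': '3.0.1'}
```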
197
validate_phase1.py
Executable file
@@ -0,0 +1,197 @@
#!/usr/bin/env python3
"""
Phase 1 validation script.

Validates that all Phase 1 deliverables are in place and code structure is correct.
Does not require dependencies to be installed.
"""

import ast
import os
import sys
from pathlib import Path


def validate_file_exists(file_path, description):
    """Check if a file exists."""
    if Path(file_path).exists():
        print(f"✓ {description}: {file_path}")
        return True
    else:
        print(f"✗ {description} missing: {file_path}")
        return False


def validate_directory_exists(dir_path, description):
    """Check if a directory exists."""
    if Path(dir_path).is_dir():
        print(f"✓ {description}: {dir_path}")
        return True
    else:
        print(f"✗ {description} missing: {dir_path}")
        return False


def validate_python_syntax(file_path):
    """Validate Python file syntax."""
    try:
        with open(file_path, 'r') as f:
            ast.parse(f.read())
        return True
    except SyntaxError as e:
        print(f"  ✗ Syntax error in {file_path}: {e}")
        return False


def main():
    """Run all validation checks."""
    print("=" * 70)
    print("SneakyScanner Phase 1 Validation")
    print("=" * 70)

    all_passed = True

    # Check project structure
    print("\n1. Project Structure:")
    print("-" * 70)

    structure_checks = [
        ("web/", "Web application directory"),
        ("web/api/", "API blueprints directory"),
        ("web/templates/", "Jinja2 templates directory"),
        ("web/static/", "Static files directory"),
        ("web/utils/", "Utility modules directory"),
        ("migrations/", "Alembic migrations directory"),
        ("migrations/versions/", "Migration versions directory"),
    ]

    for path, desc in structure_checks:
        if not validate_directory_exists(path, desc):
            all_passed = False

    # Check core files
    print("\n2. Core Files:")
    print("-" * 70)

    core_files = [
        ("requirements-web.txt", "Web dependencies"),
        ("alembic.ini", "Alembic configuration"),
        ("init_db.py", "Database initialization script"),
        ("docker-compose-web.yml", "Docker Compose for web app"),
    ]

    for path, desc in core_files:
        if not validate_file_exists(path, desc):
            all_passed = False

    # Check Python modules
    print("\n3. Python Modules:")
    print("-" * 70)

    python_modules = [
        ("web/__init__.py", "Web package init"),
        ("web/models.py", "SQLAlchemy models"),
        ("web/app.py", "Flask application factory"),
        ("web/utils/__init__.py", "Utils package init"),
        ("web/utils/settings.py", "Settings manager"),
        ("web/api/__init__.py", "API package init"),
        ("web/api/scans.py", "Scans API blueprint"),
        ("web/api/schedules.py", "Schedules API blueprint"),
        ("web/api/alerts.py", "Alerts API blueprint"),
        ("web/api/settings.py", "Settings API blueprint"),
        ("migrations/env.py", "Alembic environment"),
        ("migrations/script.py.mako", "Migration template"),
        ("migrations/versions/001_initial_schema.py", "Initial migration"),
    ]

    for path, desc in python_modules:
        exists = validate_file_exists(path, desc)
        if exists:
            # Skip syntax check for .mako templates (they're not pure Python)
            if not path.endswith('.mako'):
                if not validate_python_syntax(path):
                    all_passed = False
            else:
                print("  (Skipped syntax check for template file)")
        else:
            all_passed = False

    # Check models
    print("\n4. Database Models (from models.py):")
    print("-" * 70)

    try:
        # Read models.py and look for class definitions
        with open('web/models.py', 'r') as f:
            content = f.read()
        tree = ast.parse(content)

        models = []
        for node in ast.walk(tree):
            if isinstance(node, ast.ClassDef) and node.name != 'Base':
                models.append(node.name)

        expected_models = [
            'Scan', 'ScanSite', 'ScanIP', 'ScanPort', 'ScanService',
            'ScanCertificate', 'ScanTLSVersion', 'Schedule', 'Alert',
            'AlertRule', 'Setting'
        ]

        for model in expected_models:
            if model in models:
                print(f"✓ Model defined: {model}")
            else:
                print(f"✗ Model missing: {model}")
                all_passed = False

    except Exception as e:
        print(f"✗ Failed to parse models.py: {e}")
        all_passed = False

    # Check API endpoints
    print("\n5. API Blueprints:")
    print("-" * 70)

    blueprints = {
        'web/api/scans.py': ['list_scans', 'get_scan', 'trigger_scan', 'delete_scan'],
        'web/api/schedules.py': ['list_schedules', 'get_schedule', 'create_schedule'],
        'web/api/alerts.py': ['list_alerts', 'list_alert_rules'],
        'web/api/settings.py': ['get_settings', 'update_settings'],
    }

    for blueprint_file, expected_funcs in blueprints.items():
        try:
            with open(blueprint_file, 'r') as f:
                content = f.read()
            tree = ast.parse(content)

            functions = [node.name for node in ast.walk(tree) if isinstance(node, ast.FunctionDef)]

            print(f"\n  {blueprint_file}:")
            for func in expected_funcs:
                if func in functions:
                    print(f"  ✓ Endpoint: {func}")
                else:
                    print(f"  ✗ Missing endpoint: {func}")
                    all_passed = False
        except Exception as e:
            print(f"  ✗ Failed to parse {blueprint_file}: {e}")
            all_passed = False

    # Summary
    print("\n" + "=" * 70)
    if all_passed:
        print("✓ All Phase 1 validation checks passed!")
        print("\nNext steps:")
        print("1. Install dependencies: pip install -r requirements-web.txt")
        print("2. Initialize database: python3 init_db.py --password YOUR_PASSWORD")
        print("3. Run Flask app: python3 -m web.app")
        print("4. Test API: curl http://localhost:5000/api/settings/health")
        return 0
    else:
        print("✗ Some validation checks failed. Please review errors above.")
        return 1


if __name__ == '__main__':
    sys.exit(main())
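validate_phase1.py deliberately uses ast.parse and ast.walk to discover class and function definitions without importing the modules, which is why it runs before any dependencies are installed. The core pattern can be isolated as a small sketch (the helper name and sample source are illustrative):

```python
import ast

def top_level_names(source: str, kind=ast.FunctionDef):
    """Return names of definitions of the given AST node type, without importing."""
    tree = ast.parse(source)
    return [node.name for node in ast.walk(tree) if isinstance(node, kind)]

# Sample source standing in for a blueprint file's contents
src = "def list_scans():\n    pass\n\ndef get_scan(scan_id):\n    pass\n"
print(top_level_names(src))  # ['list_scans', 'get_scan']
```

The same call with `kind=ast.ClassDef` is what the model check uses to confirm the 11 expected models exist.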
0
web/__init__.py
Normal file
0
web/api/__init__.py
Normal file
137
web/api/alerts.py
Normal file
@@ -0,0 +1,137 @@
"""
Alerts API blueprint.

Handles endpoints for viewing alert history and managing alert rules.
"""

from flask import Blueprint, jsonify, request

bp = Blueprint('alerts', __name__)


@bp.route('', methods=['GET'])
def list_alerts():
    """
    List recent alerts.

    Query params:
        page: Page number (default: 1)
        per_page: Items per page (default: 20)
        alert_type: Filter by alert type
        severity: Filter by severity (info, warning, critical)
        start_date: Filter alerts after this date
        end_date: Filter alerts before this date

    Returns:
        JSON response with alerts list
    """
    # TODO: Implement in Phase 4
    return jsonify({
        'alerts': [],
        'total': 0,
        'page': 1,
        'per_page': 20,
        'message': 'Alerts list endpoint - to be implemented in Phase 4'
    })


@bp.route('/rules', methods=['GET'])
def list_alert_rules():
    """
    List all alert rules.

    Returns:
        JSON response with alert rules
    """
    # TODO: Implement in Phase 4
    return jsonify({
        'rules': [],
        'message': 'Alert rules list endpoint - to be implemented in Phase 4'
    })


@bp.route('/rules', methods=['POST'])
def create_alert_rule():
    """
    Create a new alert rule.

    Request body:
        rule_type: Type of alert rule
        threshold: Threshold value (e.g., days for cert expiry)
        enabled: Whether rule is active (default: true)
        email_enabled: Send email for this rule (default: false)

    Returns:
        JSON response with created rule ID
    """
    # TODO: Implement in Phase 4
    data = request.get_json() or {}

    return jsonify({
        'rule_id': None,
        'status': 'not_implemented',
        'message': 'Alert rule creation endpoint - to be implemented in Phase 4',
        'data': data
    }), 501


@bp.route('/rules/<int:rule_id>', methods=['PUT'])
def update_alert_rule(rule_id):
    """
    Update an existing alert rule.

    Args:
        rule_id: Alert rule ID to update

    Request body:
        threshold: Threshold value (optional)
        enabled: Whether rule is active (optional)
        email_enabled: Send email for this rule (optional)

    Returns:
        JSON response with update status
    """
    # TODO: Implement in Phase 4
    data = request.get_json() or {}

    return jsonify({
        'rule_id': rule_id,
        'status': 'not_implemented',
        'message': 'Alert rule update endpoint - to be implemented in Phase 4',
        'data': data
    }), 501


@bp.route('/rules/<int:rule_id>', methods=['DELETE'])
def delete_alert_rule(rule_id):
    """
    Delete an alert rule.

    Args:
        rule_id: Alert rule ID to delete

    Returns:
        JSON response with deletion status
    """
    # TODO: Implement in Phase 4
    return jsonify({
        'rule_id': rule_id,
        'status': 'not_implemented',
        'message': 'Alert rule deletion endpoint - to be implemented in Phase 4'
    }), 501


# Health check endpoint
@bp.route('/health', methods=['GET'])
def health_check():
    """
    Health check endpoint for monitoring.

    Returns:
        JSON response with API health status
    """
    return jsonify({
        'status': 'healthy',
        'api': 'alerts',
        'version': '1.0.0-phase1'
    })
150
web/api/scans.py
Normal file
@@ -0,0 +1,150 @@
"""
Scans API blueprint.

Handles endpoints for triggering scans, listing scan history, and retrieving
scan results.
"""

from flask import Blueprint, current_app, jsonify, request

bp = Blueprint('scans', __name__)


@bp.route('', methods=['GET'])
def list_scans():
    """
    List all scans with pagination.

    Query params:
        page: Page number (default: 1)
        per_page: Items per page (default: 20, max: 100)
        status: Filter by status (running, completed, failed)

    Returns:
        JSON response with scans list and pagination info
    """
    # TODO: Implement in Phase 2
    return jsonify({
        'scans': [],
        'total': 0,
        'page': 1,
        'per_page': 20,
        'message': 'Scans endpoint - to be implemented in Phase 2'
    })


@bp.route('/<int:scan_id>', methods=['GET'])
def get_scan(scan_id):
    """
    Get details for a specific scan.

    Args:
        scan_id: Scan ID

    Returns:
        JSON response with scan details
    """
    # TODO: Implement in Phase 2
    return jsonify({
        'scan_id': scan_id,
        'message': 'Scan detail endpoint - to be implemented in Phase 2'
    })


@bp.route('', methods=['POST'])
def trigger_scan():
    """
    Trigger a new scan.

    Request body:
        config_file: Path to YAML config file

    Returns:
        JSON response with scan_id and status
    """
    # TODO: Implement in Phase 2
    data = request.get_json() or {}
    config_file = data.get('config_file')

    return jsonify({
        'scan_id': None,
        'status': 'not_implemented',
        'message': 'Scan trigger endpoint - to be implemented in Phase 2',
        'config_file': config_file
    }), 501  # Not Implemented


@bp.route('/<int:scan_id>', methods=['DELETE'])
def delete_scan(scan_id):
    """
    Delete a scan and its associated files.

    Args:
        scan_id: Scan ID to delete

    Returns:
        JSON response with deletion status
    """
    # TODO: Implement in Phase 2
    return jsonify({
        'scan_id': scan_id,
        'status': 'not_implemented',
        'message': 'Scan deletion endpoint - to be implemented in Phase 2'
    }), 501


@bp.route('/<int:scan_id>/status', methods=['GET'])
def get_scan_status(scan_id):
    """
    Get current status of a running scan.

    Args:
        scan_id: Scan ID

    Returns:
        JSON response with scan status and progress
    """
    # TODO: Implement in Phase 2
    return jsonify({
        'scan_id': scan_id,
        'status': 'not_implemented',
        'progress': '0%',
        'message': 'Scan status endpoint - to be implemented in Phase 2'
    })


@bp.route('/<int:scan_id1>/compare/<int:scan_id2>', methods=['GET'])
def compare_scans(scan_id1, scan_id2):
    """
    Compare two scans and show differences.

    Args:
        scan_id1: First scan ID
        scan_id2: Second scan ID

    Returns:
        JSON response with comparison results
    """
    # TODO: Implement in Phase 4
    return jsonify({
        'scan_id1': scan_id1,
        'scan_id2': scan_id2,
        'diff': {},
        'message': 'Scan comparison endpoint - to be implemented in Phase 4'
    })


# Health check endpoint
@bp.route('/health', methods=['GET'])
def health_check():
    """
    Health check endpoint for monitoring.

    Returns:
        JSON response with API health status
    """
    return jsonify({
        'status': 'healthy',
        'api': 'scans',
        'version': '1.0.0-phase1'
    })
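list_scans documents per_page with a default of 20 and a max of 100; when Phase 2 fills in the stub, that bound reduces to a small pure clamping step. An illustrative sketch under those documented defaults (not the actual Phase 2 implementation):

```python
def clamp_pagination(page, per_page, default_per_page=20, max_per_page=100):
    """Normalize pagination query params to safe integer ranges."""
    page = max(1, int(page or 1))
    per_page = min(max(1, int(per_page or default_per_page)), max_per_page)
    return page, per_page

print(clamp_pagination(None, None))  # (1, 20)
print(clamp_pagination(3, 500))      # (3, 100)
```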
150
web/api/schedules.py
Normal file
@@ -0,0 +1,150 @@
"""
Schedules API blueprint.

Handles endpoints for managing scheduled scans including CRUD operations
and manual triggering.
"""

from flask import Blueprint, jsonify, request

bp = Blueprint('schedules', __name__)


@bp.route('', methods=['GET'])
def list_schedules():
    """
    List all schedules.

    Returns:
        JSON response with schedules list
    """
    # TODO: Implement in Phase 3
    return jsonify({
        'schedules': [],
        'message': 'Schedules list endpoint - to be implemented in Phase 3'
    })


@bp.route('/<int:schedule_id>', methods=['GET'])
def get_schedule(schedule_id):
    """
    Get details for a specific schedule.

    Args:
        schedule_id: Schedule ID

    Returns:
        JSON response with schedule details
    """
    # TODO: Implement in Phase 3
    return jsonify({
        'schedule_id': schedule_id,
        'message': 'Schedule detail endpoint - to be implemented in Phase 3'
    })


@bp.route('', methods=['POST'])
def create_schedule():
    """
    Create a new schedule.

    Request body:
        name: Schedule name
        config_file: Path to YAML config
        cron_expression: Cron-like schedule expression

    Returns:
        JSON response with created schedule ID
    """
    # TODO: Implement in Phase 3
    data = request.get_json() or {}

    return jsonify({
        'schedule_id': None,
        'status': 'not_implemented',
        'message': 'Schedule creation endpoint - to be implemented in Phase 3',
        'data': data
    }), 501


@bp.route('/<int:schedule_id>', methods=['PUT'])
def update_schedule(schedule_id):
    """
    Update an existing schedule.

    Args:
        schedule_id: Schedule ID to update

    Request body:
        name: Schedule name (optional)
        config_file: Path to YAML config (optional)
        cron_expression: Cron-like schedule expression (optional)
        enabled: Whether schedule is active (optional)

    Returns:
        JSON response with update status
    """
    # TODO: Implement in Phase 3
    data = request.get_json() or {}

    return jsonify({
        'schedule_id': schedule_id,
        'status': 'not_implemented',
        'message': 'Schedule update endpoint - to be implemented in Phase 3',
        'data': data
    }), 501


@bp.route('/<int:schedule_id>', methods=['DELETE'])
def delete_schedule(schedule_id):
    """
    Delete a schedule.

    Args:
        schedule_id: Schedule ID to delete

    Returns:
        JSON response with deletion status
    """
    # TODO: Implement in Phase 3
    return jsonify({
        'schedule_id': schedule_id,
        'status': 'not_implemented',
        'message': 'Schedule deletion endpoint - to be implemented in Phase 3'
    }), 501


@bp.route('/<int:schedule_id>/trigger', methods=['POST'])
def trigger_schedule(schedule_id):
    """
    Manually trigger a scheduled scan.

    Args:
        schedule_id: Schedule ID to trigger

    Returns:
        JSON response with triggered scan ID
    """
    # TODO: Implement in Phase 3
    return jsonify({
        'schedule_id': schedule_id,
        'scan_id': None,
        'status': 'not_implemented',
        'message': 'Manual schedule trigger endpoint - to be implemented in Phase 3'
    }), 501


# Health check endpoint
@bp.route('/health', methods=['GET'])
def health_check():
    """
    Health check endpoint for monitoring.

    Returns:
        JSON response with API health status
    """
    return jsonify({
        'status': 'healthy',
        'api': 'schedules',
        'version': '1.0.0-phase1'
    })
267
web/api/settings.py
Normal file
@@ -0,0 +1,267 @@
"""
Settings API blueprint.

Handles endpoints for managing application settings including SMTP configuration,
authentication, and system preferences.
"""

from flask import Blueprint, current_app, jsonify, request

from web.utils.settings import PasswordManager, SettingsManager

bp = Blueprint('settings', __name__)


def get_settings_manager():
    """Get SettingsManager instance with current DB session."""
    return SettingsManager(current_app.db_session)


@bp.route('', methods=['GET'])
def get_settings():
    """
    Get all settings (sanitized - encrypted values masked).

    Returns:
        JSON response with all settings
    """
    try:
        settings_manager = get_settings_manager()
        settings = settings_manager.get_all(decrypt=False, sanitize=True)

        return jsonify({
            'status': 'success',
            'settings': settings
        })
    except Exception as e:
        current_app.logger.error(f"Failed to retrieve settings: {e}")
        return jsonify({
            'status': 'error',
            'message': 'Failed to retrieve settings'
        }), 500


@bp.route('', methods=['PUT'])
def update_settings():
    """
    Update multiple settings at once.

    Request body:
        settings: Dictionary of setting key-value pairs

    Returns:
        JSON response with update status
    """
    # TODO: Add authentication in Phase 2
    data = request.get_json() or {}
    settings_dict = data.get('settings', {})

    if not settings_dict:
        return jsonify({
            'status': 'error',
            'message': 'No settings provided'
        }), 400

    try:
        settings_manager = get_settings_manager()

        # Update each setting
        for key, value in settings_dict.items():
            settings_manager.set(key, value)

        return jsonify({
            'status': 'success',
            'message': f'Updated {len(settings_dict)} settings'
        })
    except Exception as e:
        current_app.logger.error(f"Failed to update settings: {e}")
        return jsonify({
            'status': 'error',
            'message': 'Failed to update settings'
        }), 500


@bp.route('/<string:key>', methods=['GET'])
def get_setting(key):
    """
    Get a specific setting by key.

    Args:
        key: Setting key

    Returns:
        JSON response with setting value
    """
    try:
        settings_manager = get_settings_manager()
        value = settings_manager.get(key)

        if value is None:
            return jsonify({
                'status': 'error',
                'message': f'Setting "{key}" not found'
            }), 404

        # Sanitize if encrypted key
        if settings_manager._should_encrypt(key):
            value = '***ENCRYPTED***'

        return jsonify({
            'status': 'success',
            'key': key,
            'value': value
        })
    except Exception as e:
        current_app.logger.error(f"Failed to retrieve setting {key}: {e}")
        return jsonify({
            'status': 'error',
            'message': 'Failed to retrieve setting'
        }), 500


@bp.route('/<string:key>', methods=['PUT'])
def update_setting(key):
    """
    Update a specific setting.

    Args:
        key: Setting key

    Request body:
        value: New value for the setting

    Returns:
        JSON response with update status
    """
    # TODO: Add authentication in Phase 2
    data = request.get_json() or {}
    value = data.get('value')

    if value is None:
        return jsonify({
            'status': 'error',
            'message': 'No value provided'
        }), 400

    try:
        settings_manager = get_settings_manager()
        settings_manager.set(key, value)

        return jsonify({
            'status': 'success',
            'message': f'Setting "{key}" updated'
        })
    except Exception as e:
        current_app.logger.error(f"Failed to update setting {key}: {e}")
        return jsonify({
            'status': 'error',
            'message': 'Failed to update setting'
        }), 500


@bp.route('/<string:key>', methods=['DELETE'])
def delete_setting(key):
    """
    Delete a setting.

    Args:
        key: Setting key to delete

    Returns:
        JSON response with deletion status
    """
    # TODO: Add authentication in Phase 2
    try:
        settings_manager = get_settings_manager()
        deleted = settings_manager.delete(key)

        if not deleted:
            return jsonify({
                'status': 'error',
                'message': f'Setting "{key}" not found'
            }), 404

        return jsonify({
            'status': 'success',
            'message': f'Setting "{key}" deleted'
        })
    except Exception as e:
        current_app.logger.error(f"Failed to delete setting {key}: {e}")
        return jsonify({
            'status': 'error',
            'message': 'Failed to delete setting'
        }), 500


@bp.route('/password', methods=['POST'])
def set_password():
    """
    Set the application password.

    Request body:
        password: New password

    Returns:
        JSON response with status
    """
    # TODO: Add current password verification in Phase 2
    data = request.get_json() or {}
    password = data.get('password')

    if not password:
        return jsonify({
            'status': 'error',
            'message': 'No password provided'
        }), 400

    if len(password) < 8:
        return jsonify({
            'status': 'error',
            'message': 'Password must be at least 8 characters'
        }), 400

    try:
        settings_manager = get_settings_manager()
        PasswordManager.set_app_password(settings_manager, password)

        return jsonify({
            'status': 'success',
            'message': 'Password updated successfully'
        })
    except Exception as e:
        current_app.logger.error(f"Failed to set password: {e}")
        return jsonify({
            'status': 'error',
            'message': 'Failed to set password'
        }), 500


@bp.route('/test-email', methods=['POST'])
def test_email():
    """
    Test email configuration by sending a test email.

    Returns:
        JSON response with test result
    """
    # TODO: Implement in Phase 4 (email support)
    return jsonify({
        'status': 'not_implemented',
        'message': 'Email testing endpoint - to be implemented in Phase 4'
    }), 501


# Health check endpoint
@bp.route('/health', methods=['GET'])
def health_check():
    """
    Health check endpoint for monitoring.

    Returns:
        JSON response with API health status
    """
    return jsonify({
        'status': 'healthy',
        'api': 'settings',
        'version': '1.0.0-phase1'
    })
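get_setting masks values for any key that SettingsManager._should_encrypt flags, returning the literal placeholder '***ENCRYPTED***' instead of the decrypted secret. The masking rule can be sketched independently of SettingsManager; note the suffix convention below is an assumption for illustration, not the real _should_encrypt implementation:

```python
# Assumed convention: keys ending in these suffixes hold secrets
SENSITIVE_SUFFIXES = ('_password', '_token', '_secret')

def sanitize(key: str, value: str) -> str:
    """Mask values for sensitive-looking keys, mirroring get_setting()'s behavior."""
    if key.endswith(SENSITIVE_SUFFIXES):
        return '***ENCRYPTED***'
    return value

print(sanitize('smtp_password', 'hunter2'))       # ***ENCRYPTED***
print(sanitize('smtp_host', 'mail.example.com'))  # mail.example.com
```

Masking on read keeps secrets out of API responses and logs even though the PUT path still accepts plaintext values for storage (which SettingsManager encrypts with Fernet per the commit message).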
292
web/app.py
Normal file
@@ -0,0 +1,292 @@
"""
Flask application factory for SneakyScanner web interface.

This module creates and configures the Flask application with all necessary
extensions, blueprints, and middleware.
"""

import logging
import os
from pathlib import Path

from flask import Flask, jsonify
from flask_cors import CORS
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

from web.models import Base


def create_app(config: dict = None) -> Flask:
    """
    Create and configure the Flask application.

    Args:
        config: Optional configuration dictionary to override defaults

    Returns:
        Configured Flask application instance
    """
    app = Flask(__name__,
                instance_relative_config=True,
                static_folder='static',
                template_folder='templates')

    # Load default configuration
    app.config.from_mapping(
        SECRET_KEY=os.environ.get('SECRET_KEY', 'dev-secret-key-change-in-production'),
        SQLALCHEMY_DATABASE_URI=os.environ.get('DATABASE_URL', 'sqlite:///./sneakyscanner.db'),
        SQLALCHEMY_TRACK_MODIFICATIONS=False,
        JSON_SORT_KEYS=False,  # Preserve order in JSON responses
        MAX_CONTENT_LENGTH=50 * 1024 * 1024,  # 50MB max upload size
    )

    # Override with custom config if provided
    if config:
        app.config.update(config)

    # Ensure instance folder exists
    try:
        os.makedirs(app.instance_path, exist_ok=True)
    except OSError:
        pass

    # Configure logging
    configure_logging(app)

    # Initialize database
    init_database(app)

    # Initialize extensions
    init_extensions(app)

    # Register blueprints
    register_blueprints(app)

    # Register error handlers
    register_error_handlers(app)

    # Add request/response handlers
    register_request_handlers(app)

    app.logger.info("SneakyScanner Flask app initialized")

    return app

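The `config` override path is the main test seam: `from_mapping()` sets defaults, then `app.config.update(config)` overlays the caller's dict. A minimal sketch of that layering with plain dicts (the override values below are hypothetical test settings, not project defaults):

```python
# Sketch of create_app's config layering: defaults first, caller's dict wins.
defaults = {
    'SECRET_KEY': 'dev-secret-key-change-in-production',
    'SQLALCHEMY_DATABASE_URI': 'sqlite:///./sneakyscanner.db',
    'JSON_SORT_KEYS': False,
}
override = {
    'SQLALCHEMY_DATABASE_URI': 'sqlite:///:memory:',  # in-memory DB for tests
    'TESTING': True,
}

merged = {**defaults, **override}  # same net effect as app.config.update(config)
print(merged['SQLALCHEMY_DATABASE_URI'])  # sqlite:///:memory:
```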
def configure_logging(app: Flask) -> None:
    """
    Configure application logging.

    Args:
        app: Flask application instance
    """
    # Set log level from environment or default to INFO
    log_level = os.environ.get('LOG_LEVEL', 'INFO').upper()
    app.logger.setLevel(getattr(logging, log_level, logging.INFO))

    # Create logs directory if it doesn't exist
    log_dir = Path('logs')
    log_dir.mkdir(exist_ok=True)

    # File handler for all logs
    file_handler = logging.FileHandler(log_dir / 'sneakyscanner.log')
    file_handler.setLevel(logging.INFO)
    file_formatter = logging.Formatter(
        '%(asctime)s [%(levelname)s] %(name)s: %(message)s',
        datefmt='%Y-%m-%d %H:%M:%S'
    )
    file_handler.setFormatter(file_formatter)
    app.logger.addHandler(file_handler)

    # Console handler for development
    if app.debug:
        console_handler = logging.StreamHandler()
        console_handler.setLevel(logging.DEBUG)
        console_formatter = logging.Formatter(
            '[%(levelname)s] %(name)s: %(message)s'
        )
        console_handler.setFormatter(console_formatter)
        app.logger.addHandler(console_handler)

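The handler wiring above is plain stdlib `logging` and can be exercised in isolation, without Flask; a self-contained sketch using a temporary directory in place of `logs/` (logger name is arbitrary):

```python
import logging
import tempfile
from pathlib import Path

# Mirror configure_logging(): a named logger with a file handler and formatter.
log_dir = Path(tempfile.mkdtemp())
logger = logging.getLogger('sneakyscanner.demo')
logger.setLevel(logging.INFO)

file_handler = logging.FileHandler(log_dir / 'sneakyscanner.log')
file_handler.setFormatter(logging.Formatter(
    '%(asctime)s [%(levelname)s] %(name)s: %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S'
))
logger.addHandler(file_handler)

logger.info("demo message")
file_handler.flush()
content = (log_dir / 'sneakyscanner.log').read_text()
print(content.strip())
```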
def init_database(app: Flask) -> None:
    """
    Initialize database connection and session management.

    Args:
        app: Flask application instance
    """
    # Create engine
    engine = create_engine(
        app.config['SQLALCHEMY_DATABASE_URI'],
        echo=app.debug,       # Log SQL in debug mode
        pool_pre_ping=True,   # Verify connections before using
        pool_recycle=3600,    # Recycle connections after 1 hour
    )

    # Create scoped session factory
    db_session = scoped_session(
        sessionmaker(
            autocommit=False,
            autoflush=False,
            bind=engine
        )
    )

    # Store session in app for use in views
    app.db_session = db_session

    # Create tables if they don't exist (for development)
    # In production, use Alembic migrations instead
    if app.debug:
        Base.metadata.create_all(bind=engine)

    @app.teardown_appcontext
    def shutdown_session(exception=None):
        """Remove database session at end of request."""
        db_session.remove()

    app.logger.info(f"Database initialized: {app.config['SQLALCHEMY_DATABASE_URI']}")

def init_extensions(app: Flask) -> None:
    """
    Initialize Flask extensions.

    Args:
        app: Flask application instance
    """
    # CORS support for API
    CORS(app, resources={
        r"/api/*": {
            "origins": os.environ.get('CORS_ORIGINS', '*').split(','),
            "methods": ["GET", "POST", "PUT", "DELETE", "OPTIONS"],
            "allow_headers": ["Content-Type", "Authorization"],
        }
    })

    app.logger.info("Extensions initialized")

def register_blueprints(app: Flask) -> None:
    """
    Register Flask blueprints for different app sections.

    Args:
        app: Flask application instance
    """
    # Import blueprints
    from web.api.scans import bp as scans_bp
    from web.api.schedules import bp as schedules_bp
    from web.api.alerts import bp as alerts_bp
    from web.api.settings import bp as settings_bp

    # Register API blueprints
    app.register_blueprint(scans_bp, url_prefix='/api/scans')
    app.register_blueprint(schedules_bp, url_prefix='/api/schedules')
    app.register_blueprint(alerts_bp, url_prefix='/api/alerts')
    app.register_blueprint(settings_bp, url_prefix='/api/settings')

    app.logger.info("Blueprints registered")

def register_error_handlers(app: Flask) -> None:
    """
    Register error handlers for common HTTP errors.

    Args:
        app: Flask application instance
    """
    @app.errorhandler(400)
    def bad_request(error):
        return jsonify({
            'error': 'Bad Request',
            'message': str(error) or 'The request was invalid'
        }), 400

    @app.errorhandler(401)
    def unauthorized(error):
        return jsonify({
            'error': 'Unauthorized',
            'message': 'Authentication required'
        }), 401

    @app.errorhandler(403)
    def forbidden(error):
        return jsonify({
            'error': 'Forbidden',
            'message': 'You do not have permission to access this resource'
        }), 403

    @app.errorhandler(404)
    def not_found(error):
        return jsonify({
            'error': 'Not Found',
            'message': 'The requested resource was not found'
        }), 404

    @app.errorhandler(405)
    def method_not_allowed(error):
        return jsonify({
            'error': 'Method Not Allowed',
            'message': 'The HTTP method is not allowed for this endpoint'
        }), 405

    @app.errorhandler(500)
    def internal_server_error(error):
        app.logger.error(f"Internal server error: {error}")
        return jsonify({
            'error': 'Internal Server Error',
            'message': 'An unexpected error occurred'
        }), 500

def register_request_handlers(app: Flask) -> None:
    """
    Register request and response handlers.

    Args:
        app: Flask application instance
    """
    # Imported here so the request proxy is available to both handlers below
    from flask import request

    @app.before_request
    def log_request():
        """Log incoming requests."""
        if app.debug:
            app.logger.debug(f"{request.method} {request.path}")

    @app.after_request
    def add_security_headers(response):
        """Add security headers to all responses."""
        # Only add security headers for API routes
        if request.path.startswith('/api/'):
            response.headers['X-Content-Type-Options'] = 'nosniff'
            response.headers['X-Frame-Options'] = 'DENY'
            response.headers['X-XSS-Protection'] = '1; mode=block'

        return response


# Development server entry point
def main():
    """Run development server."""
    app = create_app()
    app.run(
        host=os.environ.get('FLASK_HOST', '0.0.0.0'),
        port=int(os.environ.get('FLASK_PORT', 5000)),
        debug=os.environ.get('FLASK_DEBUG', 'True').lower() == 'true'
    )


if __name__ == '__main__':
    main()
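The path-gated header logic in `add_security_headers` can be factored into a pure function for unit testing without a Flask request context; a sketch (the function name and dict-based headers are illustrative, not part of the module):

```python
def apply_security_headers(path: str, headers: dict) -> dict:
    """Add security headers only for API routes, mirroring add_security_headers()."""
    if path.startswith('/api/'):
        headers['X-Content-Type-Options'] = 'nosniff'
        headers['X-Frame-Options'] = 'DENY'
        headers['X-XSS-Protection'] = '1; mode=block'
    return headers

api_headers = apply_security_headers('/api/scans', {})
page_headers = apply_security_headers('/dashboard', {})
print(api_headers.get('X-Frame-Options'))  # DENY
```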
345 web/models.py Normal file
@@ -0,0 +1,345 @@
"""
SQLAlchemy models for SneakyScanner database.

This module defines all database tables for storing scan results, schedules,
alerts, and application settings. The schema supports the full scanning workflow
from port discovery through service detection and SSL/TLS analysis.
"""

from datetime import datetime
from typing import Optional

from sqlalchemy import (
    Boolean,
    Column,
    DateTime,
    Float,
    ForeignKey,
    Integer,
    String,
    Text,
    UniqueConstraint,
)
from sqlalchemy.orm import DeclarativeBase, relationship


class Base(DeclarativeBase):
    """Base class for all SQLAlchemy models."""
    pass


# ============================================================================
# Core Scan Tables
# ============================================================================

class Scan(Base):
    """
    Stores metadata about each scan execution.

    This is the parent table that ties together all scan results including
    sites, IPs, ports, services, certificates, and TLS configuration.
    """
    __tablename__ = 'scans'

    id = Column(Integer, primary_key=True, autoincrement=True)
    timestamp = Column(DateTime, nullable=False, index=True, comment="Scan start time (UTC)")
    duration = Column(Float, nullable=True, comment="Total scan duration in seconds")
    status = Column(String(20), nullable=False, default='running', comment="running, completed, failed")
    config_file = Column(Text, nullable=True, comment="Path to YAML config used")
    title = Column(Text, nullable=True, comment="Scan title from config")
    json_path = Column(Text, nullable=True, comment="Path to JSON report")
    html_path = Column(Text, nullable=True, comment="Path to HTML report")
    zip_path = Column(Text, nullable=True, comment="Path to ZIP archive")
    screenshot_dir = Column(Text, nullable=True, comment="Path to screenshot directory")
    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Record creation time")
    triggered_by = Column(String(50), nullable=False, default='manual', comment="manual, scheduled, api")
    schedule_id = Column(Integer, ForeignKey('schedules.id'), nullable=True, comment="FK to schedules if triggered by schedule")

    # Relationships
    sites = relationship('ScanSite', back_populates='scan', cascade='all, delete-orphan')
    ips = relationship('ScanIP', back_populates='scan', cascade='all, delete-orphan')
    ports = relationship('ScanPort', back_populates='scan', cascade='all, delete-orphan')
    services = relationship('ScanService', back_populates='scan', cascade='all, delete-orphan')
    certificates = relationship('ScanCertificate', back_populates='scan', cascade='all, delete-orphan')
    tls_versions = relationship('ScanTLSVersion', back_populates='scan', cascade='all, delete-orphan')
    alerts = relationship('Alert', back_populates='scan', cascade='all, delete-orphan')
    schedule = relationship('Schedule', back_populates='scans')

    def __repr__(self):
        return f"<Scan(id={self.id}, title='{self.title}', status='{self.status}')>"

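The `status`/`duration` columns model a simple lifecycle: a row is inserted as `running` with no duration, then finalized on completion. A minimal sqlite3 sketch of that lifecycle (schema reduced to the relevant columns; the real app goes through SQLAlchemy, and the duration value is made up):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("""
    CREATE TABLE scans (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        status TEXT NOT NULL DEFAULT 'running',
        duration REAL,
        triggered_by TEXT NOT NULL DEFAULT 'manual'
    )
""")

# A scan starts as 'running' with no duration...
cur = conn.execute("INSERT INTO scans DEFAULT VALUES")
scan_id = cur.lastrowid

# ...and is finalized when the scan completes.
conn.execute(
    "UPDATE scans SET status = 'completed', duration = ? WHERE id = ?",
    (42.5, scan_id),
)
row = conn.execute("SELECT status, duration FROM scans WHERE id = ?", (scan_id,)).fetchone()
print(row)  # ('completed', 42.5)
```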
class ScanSite(Base):
    """
    Logical grouping of IPs by site.

    Sites represent logical network segments or locations (e.g., "Production DC",
    "DMZ", "Branch Office") as defined in the scan configuration.
    """
    __tablename__ = 'scan_sites'

    id = Column(Integer, primary_key=True, autoincrement=True)
    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
    site_name = Column(String(255), nullable=False, comment="Site name from config")

    # Relationships
    scan = relationship('Scan', back_populates='sites')
    ips = relationship('ScanIP', back_populates='site', cascade='all, delete-orphan')

    def __repr__(self):
        return f"<ScanSite(id={self.id}, site_name='{self.site_name}')>"

class ScanIP(Base):
    """
    IP addresses scanned in each scan.

    Stores the target IPs and their ping response status (expected vs. actual).
    """
    __tablename__ = 'scan_ips'

    id = Column(Integer, primary_key=True, autoincrement=True)
    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
    site_id = Column(Integer, ForeignKey('scan_sites.id'), nullable=False, index=True)
    ip_address = Column(String(45), nullable=False, comment="IPv4 or IPv6 address")
    ping_expected = Column(Boolean, nullable=True, comment="Expected ping response")
    ping_actual = Column(Boolean, nullable=True, comment="Actual ping response")

    # Relationships
    scan = relationship('Scan', back_populates='ips')
    site = relationship('ScanSite', back_populates='ips')
    ports = relationship('ScanPort', back_populates='ip', cascade='all, delete-orphan')

    # Unique index for efficient IP lookups within a scan
    __table_args__ = (
        UniqueConstraint('scan_id', 'ip_address', name='uix_scan_ip'),
    )

    def __repr__(self):
        return f"<ScanIP(id={self.id}, ip_address='{self.ip_address}')>"

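The `(scan_id, ip_address)` unique constraint means the same IP can appear across different scans but only once per scan. A sqlite3 sketch of that behavior (columns reduced to the pair under constraint):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("""
    CREATE TABLE scan_ips (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        scan_id INTEGER NOT NULL,
        ip_address TEXT NOT NULL,
        UNIQUE (scan_id, ip_address)
    )
""")

conn.execute("INSERT INTO scan_ips (scan_id, ip_address) VALUES (1, '10.0.0.5')")
conn.execute("INSERT INTO scan_ips (scan_id, ip_address) VALUES (2, '10.0.0.5')")  # ok: different scan

try:
    # Same IP in the same scan violates the unique constraint.
    conn.execute("INSERT INTO scan_ips (scan_id, ip_address) VALUES (1, '10.0.0.5')")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True

print(duplicate_rejected)  # True
```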
class ScanPort(Base):
    """
    Discovered TCP/UDP ports.

    Stores all open ports found during the masscan phase, along with expected vs.
    actual status for drift detection.
    """
    __tablename__ = 'scan_ports'

    id = Column(Integer, primary_key=True, autoincrement=True)
    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
    ip_id = Column(Integer, ForeignKey('scan_ips.id'), nullable=False, index=True)
    port = Column(Integer, nullable=False, comment="Port number (1-65535)")
    protocol = Column(String(10), nullable=False, comment="tcp or udp")
    expected = Column(Boolean, nullable=True, comment="Was this port expected?")
    state = Column(String(20), nullable=False, default='open', comment="open, closed, filtered")

    # Relationships
    scan = relationship('Scan', back_populates='ports')
    ip = relationship('ScanIP', back_populates='ports')
    services = relationship('ScanService', back_populates='port', cascade='all, delete-orphan')

    # Unique index for efficient port lookups
    __table_args__ = (
        UniqueConstraint('scan_id', 'ip_id', 'port', 'protocol', name='uix_scan_ip_port'),
    )

    def __repr__(self):
        return f"<ScanPort(id={self.id}, port={self.port}, protocol='{self.protocol}')>"

class ScanService(Base):
    """
    Detected services on open ports.

    Stores nmap service detection results including product names, versions,
    and HTTP/HTTPS information with screenshots.
    """
    __tablename__ = 'scan_services'

    id = Column(Integer, primary_key=True, autoincrement=True)
    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
    port_id = Column(Integer, ForeignKey('scan_ports.id'), nullable=False, index=True)
    service_name = Column(String(100), nullable=True, comment="Service name (e.g., ssh, http)")
    product = Column(String(255), nullable=True, comment="Product name (e.g., OpenSSH)")
    version = Column(String(100), nullable=True, comment="Version string")
    extrainfo = Column(Text, nullable=True, comment="Additional nmap info")
    ostype = Column(String(100), nullable=True, comment="OS type if detected")
    http_protocol = Column(String(10), nullable=True, comment="http or https (if web service)")
    screenshot_path = Column(Text, nullable=True, comment="Relative path to screenshot")

    # Relationships
    scan = relationship('Scan', back_populates='services')
    port = relationship('ScanPort', back_populates='services')
    certificates = relationship('ScanCertificate', back_populates='service', cascade='all, delete-orphan')

    def __repr__(self):
        return f"<ScanService(id={self.id}, service_name='{self.service_name}', product='{self.product}')>"

class ScanCertificate(Base):
    """
    SSL/TLS certificates discovered on HTTPS services.

    Stores certificate details including validity periods, subject/issuer,
    and flags for self-signed certificates.
    """
    __tablename__ = 'scan_certificates'

    id = Column(Integer, primary_key=True, autoincrement=True)
    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
    service_id = Column(Integer, ForeignKey('scan_services.id'), nullable=False, index=True)
    subject = Column(Text, nullable=True, comment="Certificate subject (CN)")
    issuer = Column(Text, nullable=True, comment="Certificate issuer")
    serial_number = Column(Text, nullable=True, comment="Serial number")
    not_valid_before = Column(DateTime, nullable=True, comment="Validity start date")
    not_valid_after = Column(DateTime, nullable=True, comment="Validity end date")
    days_until_expiry = Column(Integer, nullable=True, comment="Days until expiration")
    sans = Column(Text, nullable=True, comment="JSON array of SANs")
    is_self_signed = Column(Boolean, nullable=True, default=False, comment="Self-signed certificate flag")

    # Relationships
    scan = relationship('Scan', back_populates='certificates')
    service = relationship('ScanService', back_populates='certificates')
    tls_versions = relationship('ScanTLSVersion', back_populates='certificate', cascade='all, delete-orphan')

    # Note: this only sets a table-level comment; an actual index on
    # not_valid_after for expiration/alert queries can be added via migration.
    __table_args__ = (
        {'comment': 'Index on expiration date for alert queries'},
    )

    def __repr__(self):
        return f"<ScanCertificate(id={self.id}, subject='{self.subject}', days_until_expiry={self.days_until_expiry})>"

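`days_until_expiry` is a value derived from `not_valid_after` at scan time; a sketch of that computation (the dates below are made up):

```python
from datetime import datetime

def days_until_expiry(not_valid_after: datetime, now: datetime) -> int:
    """Whole days remaining before the certificate expires (negative if expired)."""
    return (not_valid_after - now).days

now = datetime(2024, 1, 1)
print(days_until_expiry(datetime(2024, 3, 1), now))   # 60
print(days_until_expiry(datetime(2023, 12, 1), now))  # -31 (already expired)
```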
class ScanTLSVersion(Base):
    """
    TLS version support and cipher suites.

    Stores which TLS versions (1.0, 1.1, 1.2, 1.3) are supported by each
    HTTPS service, along with accepted cipher suites.
    """
    __tablename__ = 'scan_tls_versions'

    id = Column(Integer, primary_key=True, autoincrement=True)
    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
    certificate_id = Column(Integer, ForeignKey('scan_certificates.id'), nullable=False, index=True)
    tls_version = Column(String(20), nullable=False, comment="TLS 1.0, TLS 1.1, TLS 1.2, TLS 1.3")
    supported = Column(Boolean, nullable=False, comment="Is this version supported?")
    cipher_suites = Column(Text, nullable=True, comment="JSON array of cipher suites")

    # Relationships
    scan = relationship('Scan', back_populates='tls_versions')
    certificate = relationship('ScanCertificate', back_populates='tls_versions')

    def __repr__(self):
        return f"<ScanTLSVersion(id={self.id}, tls_version='{self.tls_version}', supported={self.supported})>"


# ============================================================================
# Scheduling & Notifications Tables
# ============================================================================

class Schedule(Base):
    """
    Scheduled scan configurations.

    Stores cron-like schedules for automated periodic scanning of network
    infrastructure.
    """
    __tablename__ = 'schedules'

    id = Column(Integer, primary_key=True, autoincrement=True)
    name = Column(String(255), nullable=False, comment="Schedule name (e.g., 'Daily prod scan')")
    config_file = Column(Text, nullable=False, comment="Path to YAML config")
    cron_expression = Column(String(100), nullable=False, comment="Cron-like schedule (e.g., '0 2 * * *')")
    enabled = Column(Boolean, nullable=False, default=True, comment="Is schedule active?")
    last_run = Column(DateTime, nullable=True, comment="Last execution time")
    next_run = Column(DateTime, nullable=True, comment="Next scheduled execution")
    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Schedule creation time")
    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow, comment="Last modification time")

    # Relationships
    scans = relationship('Scan', back_populates='schedule')

    def __repr__(self):
        return f"<Schedule(id={self.id}, name='{self.name}', enabled={self.enabled})>"

class Alert(Base):
    """
    Alert history and notifications sent.

    Stores all alerts generated by the alert rule engine, including severity
    levels and email notification status.
    """
    __tablename__ = 'alerts'

    id = Column(Integer, primary_key=True, autoincrement=True)
    scan_id = Column(Integer, ForeignKey('scans.id'), nullable=False, index=True)
    alert_type = Column(String(50), nullable=False, comment="new_port, cert_expiry, service_change, ping_failed")
    severity = Column(String(20), nullable=False, comment="info, warning, critical")
    message = Column(Text, nullable=False, comment="Human-readable alert message")
    ip_address = Column(String(45), nullable=True, comment="Related IP (optional)")
    port = Column(Integer, nullable=True, comment="Related port (optional)")
    email_sent = Column(Boolean, nullable=False, default=False, comment="Was email notification sent?")
    email_sent_at = Column(DateTime, nullable=True, comment="Email send timestamp")
    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Alert creation time")

    # Relationships
    scan = relationship('Scan', back_populates='alerts')

    # Note: this only sets a table-level comment; indexes for filtering by
    # alert_type/severity can be added via migration.
    __table_args__ = (
        {'comment': 'Indexes for alert filtering'},
    )

    def __repr__(self):
        return f"<Alert(id={self.id}, alert_type='{self.alert_type}', severity='{self.severity}')>"

class AlertRule(Base):
    """
    User-defined alert rules.

    Configurable rules that trigger alerts based on scan results (e.g.,
    certificates expiring in <30 days, unexpected ports opened).
    """
    __tablename__ = 'alert_rules'

    id = Column(Integer, primary_key=True, autoincrement=True)
    rule_type = Column(String(50), nullable=False, comment="unexpected_port, cert_expiry, service_down, etc.")
    enabled = Column(Boolean, nullable=False, default=True, comment="Is rule active?")
    threshold = Column(Integer, nullable=True, comment="Threshold value (e.g., days for cert expiry)")
    email_enabled = Column(Boolean, nullable=False, default=False, comment="Send email for this rule?")
    created_at = Column(DateTime, nullable=False, default=datetime.utcnow, comment="Rule creation time")

    def __repr__(self):
        return f"<AlertRule(id={self.id}, rule_type='{self.rule_type}', enabled={self.enabled})>"

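A `cert_expiry` rule pairs its `threshold` (days) with the certificate's remaining lifetime to decide whether an alert fires and at what severity. A pure-Python sketch of that evaluation (the 7-day critical cutoff and the function name are illustrative, not taken from the rule engine):

```python
from typing import Optional

def evaluate_cert_expiry(days_left: int, threshold: int) -> Optional[dict]:
    """Return an alert dict if the certificate expires within `threshold` days."""
    if days_left > threshold:
        return None  # certificate is fine; no alert
    severity = 'critical' if days_left <= 7 else 'warning'
    return {
        'alert_type': 'cert_expiry',
        'severity': severity,
        'message': f'Certificate expires in {days_left} days',
    }

print(evaluate_cert_expiry(45, 30))               # None
print(evaluate_cert_expiry(20, 30)['severity'])   # warning
print(evaluate_cert_expiry(3, 30)['severity'])    # critical
```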
# ============================================================================
# Settings Table
# ============================================================================

class Setting(Base):
    """
    Application configuration key-value store.

    Stores application settings including SMTP configuration, authentication,
    and retention policies. Values stored as JSON for complex data types.
    """
    __tablename__ = 'settings'

    id = Column(Integer, primary_key=True, autoincrement=True)
    key = Column(String(255), nullable=False, unique=True, index=True, comment="Setting key (e.g., smtp_server)")
    value = Column(Text, nullable=True, comment="Setting value (JSON for complex values)")
    updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow, comment="Last modification time")

    def __repr__(self):
        return f"<Setting(key='{self.key}', value='{self.value[:50] if self.value else None}...')>"
0 web/utils/__init__.py Normal file
323 web/utils/settings.py Normal file
@@ -0,0 +1,323 @@
"""
Settings management system for SneakyScanner.

Provides secure storage and retrieval of application settings with encryption
for sensitive values like passwords and API tokens.
"""

import json
import os
from datetime import datetime
from typing import Any, Dict, List, Optional

import bcrypt
from cryptography.fernet import Fernet
from sqlalchemy.orm import Session

from web.models import Setting


class SettingsManager:
    """
    Manages application settings with encryption support.

    Handles CRUD operations for settings stored in the database, with automatic
    encryption/decryption for sensitive values.
    """

    # Keys that should be encrypted when stored
    ENCRYPTED_KEYS = {
        'smtp_password',
        'api_token',
        'encryption_key',
    }

    def __init__(self, db_session: Session, encryption_key: Optional[bytes] = None):
        """
        Initialize the settings manager.

        Args:
            db_session: SQLAlchemy database session
            encryption_key: Fernet encryption key (32 url-safe base64-encoded bytes).
                If not provided, will generate or load from environment.
        """
        self.db = db_session
        self._encryption_key = encryption_key or self._get_or_create_encryption_key()
        self._cipher = Fernet(self._encryption_key)

    def _get_or_create_encryption_key(self) -> bytes:
        """
        Get encryption key from environment or generate new one.

        Returns:
            Fernet encryption key (32 url-safe base64-encoded bytes)
        """
        # Try to get from environment variable
        key_str = os.environ.get('SNEAKYSCANNER_ENCRYPTION_KEY')
        if key_str:
            return key_str.encode()

        # Try to get from settings table (for persistence)
        existing_key = self.get('encryption_key', decrypt=False)
        if existing_key:
            return existing_key.encode()

        # Generate new key if none exists
        new_key = Fernet.generate_key()
        # Store it in settings (unencrypted, as it's the key itself).
        # Caveat: the key then lives in the same database as the values it
        # protects, so setting SNEAKYSCANNER_ENCRYPTION_KEY is preferable.
        self._store_raw('encryption_key', new_key.decode())
        return new_key

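The lookup order above (environment variable, then persisted setting, then freshly generated key) can be sketched as a pure function; the function name is illustrative and `secrets.token_urlsafe` stands in for `Fernet.generate_key()` so the sketch stays stdlib-only:

```python
import secrets
from typing import Optional

def resolve_encryption_key(env: dict, stored: Optional[str]) -> str:
    """Precedence: env var, then value persisted in settings, else a new key."""
    if env.get('SNEAKYSCANNER_ENCRYPTION_KEY'):
        return env['SNEAKYSCANNER_ENCRYPTION_KEY']
    if stored:
        return stored
    return secrets.token_urlsafe(32)  # stand-in for Fernet.generate_key()

print(resolve_encryption_key({'SNEAKYSCANNER_ENCRYPTION_KEY': 'from-env'}, 'from-db'))  # from-env
print(resolve_encryption_key({}, 'from-db'))  # from-db
```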
    def _store_raw(self, key: str, value: str) -> None:
        """Store a setting without encryption (internal use only)."""
        setting = self.db.query(Setting).filter_by(key=key).first()
        if setting:
            setting.value = value
            setting.updated_at = datetime.utcnow()
        else:
            setting = Setting(key=key, value=value)
            self.db.add(setting)
        self.db.commit()

    def _should_encrypt(self, key: str) -> bool:
        """Check if a setting key should be encrypted."""
        return key in self.ENCRYPTED_KEYS

    def _encrypt(self, value: str) -> str:
        """Encrypt a string value."""
        return self._cipher.encrypt(value.encode()).decode()

    def _decrypt(self, encrypted_value: str) -> str:
        """Decrypt an encrypted value."""
        return self._cipher.decrypt(encrypted_value.encode()).decode()

    def get(self, key: str, default: Any = None, decrypt: bool = True) -> Any:
        """
        Get a setting value by key.

        Args:
            key: Setting key to retrieve
            default: Default value if key not found
            decrypt: Whether to decrypt if value is encrypted

        Returns:
            Setting value (automatically decrypts if needed and decrypt=True)
        """
        setting = self.db.query(Setting).filter_by(key=key).first()
        if not setting:
            return default

        value = setting.value
        if value is None:
            return default

        # Decrypt if needed
        if decrypt and self._should_encrypt(key):
            try:
                value = self._decrypt(value)
            except Exception:
                # If decryption fails, return as-is (might be legacy unencrypted value)
                pass

        # Try to parse JSON for complex types
        if value.startswith('[') or value.startswith('{'):
            try:
                return json.loads(value)
            except json.JSONDecodeError:
                pass

        return value

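The JSON round-trip between `set()` and `get()` relies on a `startswith` heuristic: only stored strings that look like JSON containers are parsed back. A sketch of that logic in isolation (function names are illustrative; the SMTP values are made up):

```python
import json

def encode_value(value):
    """Mirror set(): dicts/lists become JSON, everything else str()."""
    return json.dumps(value) if isinstance(value, (dict, list)) else str(value)

def decode_value(value: str):
    """Mirror get(): only strings that look like JSON containers are parsed."""
    if value.startswith('[') or value.startswith('{'):
        try:
            return json.loads(value)
        except json.JSONDecodeError:
            pass
    return value

stored = encode_value({'host': 'smtp.example.com', 'port': 587})
print(decode_value(stored))          # round-trips back to a dict
print(decode_value('plain string'))  # non-JSON strings pass through unchanged
print(decode_value('[not json'))     # parse fails, raw value returned
```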
    def set(self, key: str, value: Any, encrypt: Optional[bool] = None) -> None:
        """
        Set a setting value.

        Args:
            key: Setting key
            value: Setting value (will be JSON-encoded if dict/list)
            encrypt: Force encryption on/off (None = auto-detect from ENCRYPTED_KEYS)
        """
        # Convert complex types to JSON
        if isinstance(value, (dict, list)):
            value_str = json.dumps(value)
        else:
            value_str = str(value)

        # Determine if we should encrypt
        should_encrypt = encrypt if encrypt is not None else self._should_encrypt(key)

        if should_encrypt:
            value_str = self._encrypt(value_str)

        # Store in database
        setting = self.db.query(Setting).filter_by(key=key).first()
        if setting:
            setting.value = value_str
            setting.updated_at = datetime.utcnow()
        else:
            setting = Setting(key=key, value=value_str)
            self.db.add(setting)

        self.db.commit()

    def delete(self, key: str) -> bool:
        """
        Delete a setting.

        Args:
            key: Setting key to delete

        Returns:
            True if deleted, False if key not found
        """
        setting = self.db.query(Setting).filter_by(key=key).first()
        if setting:
            self.db.delete(setting)
            self.db.commit()
            return True
        return False

    def get_all(self, decrypt: bool = False, sanitize: bool = True) -> Dict[str, Any]:
        """
        Get all settings as a dictionary.

        Args:
            decrypt: Whether to decrypt encrypted values
            sanitize: If True, replaces sensitive values with the
                '***ENCRYPTED***' placeholder so they are safe to expose

        Returns:
            Dictionary of all settings
        """
        settings = self.db.query(Setting).all()
        result = {}

        for setting in settings:
            key = setting.key
            value = setting.value

            if value is None:
                result[key] = None
                continue

            # Handle sanitization for sensitive keys
            if sanitize and self._should_encrypt(key):
                result[key] = '***ENCRYPTED***'
                continue

            # Decrypt if requested
            if decrypt and self._should_encrypt(key):
                try:
                    value = self._decrypt(value)
                except Exception:
                    pass

            # Try to parse JSON
            if value and (value.startswith('[') or value.startswith('{')):
                try:
                    value = json.loads(value)
                except json.JSONDecodeError:
                    pass

            result[key] = value

        return result

    def init_defaults(self) -> None:
        """
        Initialize default settings if they don't exist.

        This should be called on first app startup to populate default values.
        """
        defaults = {
            # SMTP settings
            'smtp_server': 'localhost',
            'smtp_port': 587,
            'smtp_username': '',
            'smtp_password': '',
            'smtp_from_email': 'noreply@sneakyscanner.local',
            'smtp_to_emails': [],

            # Authentication
            'app_password': '',  # Will need to be set by user

            # Retention policy
            'retention_days': 0,  # 0 = keep forever

            # Alert settings
            'cert_expiry_threshold': 30,  # Days before expiry to alert
            'email_alerts_enabled': False,
        }

        for key, value in defaults.items():
            # Only set if the key doesn't already exist
            if self.db.query(Setting).filter_by(key=key).first() is None:
                self.set(key, value)

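# --- Illustrative sketch (hypothetical helper, not part of SneakyScan) -------
# SettingsManager keeps every value as a string: set() JSON-encodes dicts and
# lists and stringifies everything else, while get() re-parses any stored
# string that begins with '[' or '{'. The helper below mirrors that round-trip
# in isolation; note the consequence that scalars (ints, bools) come back as
# strings, which callers must account for.
def _sketch_settings_roundtrip(value):
    """Sketch of the serialize/deserialize convention used by SettingsManager."""
    import json

    # Serialize the way set() does
    stored = json.dumps(value) if isinstance(value, (dict, list)) else str(value)
    # Deserialize the way get() does
    if stored.startswith('[') or stored.startswith('{'):
        try:
            return json.loads(stored)
        except json.JSONDecodeError:
            pass
    return stored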

class PasswordManager:
    """
    Manages password hashing and verification using bcrypt.

    Used for the single-user authentication system.
    """

    @staticmethod
    def hash_password(password: str) -> str:
        """
        Hash a password using bcrypt.

        Args:
            password: Plain text password

        Returns:
            Bcrypt hash string
        """
        # bcrypt.gensalt() defaults to a work factor (cost) of 12
        return bcrypt.hashpw(password.encode(), bcrypt.gensalt()).decode()

    @staticmethod
    def verify_password(password: str, hashed: str) -> bool:
        """
        Verify a password against a bcrypt hash.

        Args:
            password: Plain text password to verify
            hashed: Bcrypt hash to check against

        Returns:
            True if password matches, False otherwise
        """
        try:
            return bcrypt.checkpw(password.encode(), hashed.encode())
        except Exception:
            return False

    @staticmethod
    def set_app_password(settings_manager: SettingsManager, password: str) -> None:
        """
        Set the application password (stored as a bcrypt hash).

        Args:
            settings_manager: SettingsManager instance
            password: New password to set
        """
        hashed = PasswordManager.hash_password(password)
        # Password hash stored as a regular setting (not encrypted, as it's already a hash)
        settings_manager.set('app_password', hashed, encrypt=False)

    @staticmethod
    def verify_app_password(settings_manager: SettingsManager, password: str) -> bool:
        """
        Verify the application password.

        Args:
            settings_manager: SettingsManager instance
            password: Password to verify

        Returns:
            True if password matches, False otherwise
        """
        stored_hash = settings_manager.get('app_password', decrypt=False)
        if not stored_hash:
            # No password set - should prompt user to create one
            return False
        return PasswordManager.verify_password(password, stored_hash)
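
# --- Usage sketch (hypothetical; assumes a configured SQLAlchemy session) ----
# How the two managers are expected to compose; names like `db_session` and the
# example values are illustrative, not part of the module:
#
#   settings = SettingsManager(db_session)
#   settings.init_defaults()
#   settings.set('smtp_password', 's3cret')        # auto-encrypted via Fernet
#   settings.get('smtp_password')                  # transparently decrypted
#   PasswordManager.set_app_password(settings, 'hunter2')
#   PasswordManager.verify_app_password(settings, 'hunter2')  # True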