Commit Graph

2 Commits

ee0c5a2c3c Phase 2 Step 3: Implement Background Job Queue
Implemented APScheduler integration for background scan execution,
enabling async job processing without blocking HTTP requests.

## Changes

### Background Jobs (web/jobs/)
- scan_job.py - Execute scans in background threads
  - execute_scan() with isolated database sessions
  - Comprehensive error handling and logging
  - Scan status lifecycle tracking
  - Timing and error message storage
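
A minimal sketch of the execute_scan() flow described above, assuming a SessionLocal session factory and a run_scan() entry point (both hypothetical names; only the lifecycle and error handling come from the bullets):

```python
# web/jobs/scan_job.py (sketch) -- SessionLocal and run_scan are assumptions
import logging
from datetime import datetime

from web.database import SessionLocal   # hypothetical session factory
from web.models import Scan             # assumed import path
from web.scanner import run_scan        # hypothetical scanner entry point

logger = logging.getLogger(__name__)


def execute_scan(scan_id: int) -> None:
    """Run one scan in a background thread using an isolated DB session."""
    session = SessionLocal()             # one session per background thread
    scan = session.get(Scan, scan_id)
    try:
        scan.status = "running"
        scan.started_at = datetime.utcnow()
        session.commit()

        run_scan(scan)                   # the actual scanning work

        scan.status = "completed"
    except Exception as exc:
        logger.exception("Scan %s failed", scan_id)
        scan.status = "failed"
        scan.error_message = str(exc)    # stored for troubleshooting
    finally:
        scan.completed_at = datetime.utcnow()
        session.commit()
        session.close()
```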

### Scheduler Service (web/services/scheduler_service.py)
- SchedulerService class for job management
- APScheduler BackgroundScheduler integration
- ThreadPoolExecutor for concurrent jobs (max 3 workers)
- queue_scan() - Queue a scan job for immediate execution
- Job monitoring: list_jobs(), get_job_status()
- Graceful shutdown handling
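
A rough sketch of how SchedulerService wraps the standard APScheduler API; everything beyond the method names listed above is an assumption:

```python
# web/services/scheduler_service.py (sketch)
from apscheduler.executors.pool import ThreadPoolExecutor
from apscheduler.schedulers.background import BackgroundScheduler

from web.jobs.scan_job import execute_scan  # assumed import path


class SchedulerService:
    def __init__(self, max_workers: int = 3):
        self._scheduler = BackgroundScheduler(
            executors={"default": ThreadPoolExecutor(max_workers)}
        )
        self._scheduler.start()

    def queue_scan(self, scan_id: int):
        # add_job without a trigger runs the job once, immediately
        return self._scheduler.add_job(
            execute_scan, args=[scan_id], id=f"scan_{scan_id}"
        )

    def list_jobs(self):
        return self._scheduler.get_jobs()

    def get_job_status(self, job_id: str):
        job = self._scheduler.get_job(job_id)
        return None if job is None else {"id": job.id, "next_run": job.next_run_time}

    def shutdown(self):
        # wait=True blocks until running jobs finish (graceful shutdown)
        self._scheduler.shutdown(wait=True)
```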

### Flask Integration (web/app.py)
- init_scheduler() function
- Scheduler initialization in app factory
- Scheduler instance stored on the Flask app object (app.scheduler)
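
The factory hookup amounts to a few lines; a sketch, with the rest of the create_app() body elided and assumed:

```python
# web/app.py (sketch)
from flask import Flask

from web.services.scheduler_service import SchedulerService


def init_scheduler(app: Flask) -> None:
    """Create the scheduler and keep a reference on the app object."""
    app.scheduler = SchedulerService()   # reachable later as current_app.scheduler


def create_app() -> Flask:
    app = Flask(__name__)
    # ... config, database, blueprint registration ...
    init_scheduler(app)
    return app
```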

### Database Schema (migration 003)
- Added scan timing fields:
  - started_at - Scan execution start time
  - completed_at - Scan execution completion time
  - error_message - Error details for failed scans
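
Migration 003 boils down to three nullable columns; an Alembic sketch, with the scans table name and revision ids assumed:

```python
# migrations/versions/003_add_scan_timing_fields.py (sketch)
import sqlalchemy as sa
from alembic import op

revision = "003"
down_revision = "002"   # assumed previous revision id


def upgrade():
    op.add_column("scans", sa.Column("started_at", sa.DateTime(), nullable=True))
    op.add_column("scans", sa.Column("completed_at", sa.DateTime(), nullable=True))
    op.add_column("scans", sa.Column("error_message", sa.Text(), nullable=True))


def downgrade():
    op.drop_column("scans", "error_message")
    op.drop_column("scans", "completed_at")
    op.drop_column("scans", "started_at")
```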

### Service Layer Updates (web/services/scan_service.py)
- trigger_scan() accepts scheduler parameter
- Queues background jobs after creating scan record
- get_scan_status() includes new timing and error fields
- _save_scan_to_db() sets completed_at timestamp
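
How trigger_scan() hands off to the scheduler, roughly (field names and session handling are assumptions):

```python
# web/services/scan_service.py (sketch)
from web.database import db_session      # hypothetical scoped session
from web.models import Scan


def trigger_scan(config: dict, scheduler=None):
    """Create the scan record first, then queue the background job."""
    scan = Scan(status="created", target=config.get("target"))  # fields assumed
    db_session.add(scan)
    db_session.commit()                   # record exists before the job starts

    if scheduler is not None:
        scheduler.queue_scan(scan.id)     # becomes APScheduler job "scan_{id}"
    return scan
```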

### API Updates (web/api/scans.py)
- POST /api/scans passes scheduler to trigger_scan()
- Scans now execute in the background automatically
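
On the API side the handler only needs to pass the app-level scheduler through; a sketch with the blueprint name and response shape assumed:

```python
# web/api/scans.py (sketch)
from flask import Blueprint, current_app, jsonify, request

from web.services.scan_service import trigger_scan

scans_bp = Blueprint("scans", __name__)


@scans_bp.route("/api/scans", methods=["POST"])
def create_scan():
    scan = trigger_scan(request.get_json(), scheduler=current_app.scheduler)
    # 202 Accepted: the scan has been queued, not yet completed
    return jsonify({"scan_id": scan.id, "status": scan.status}), 202
```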

### Model Updates (web/models.py)
- Added started_at, completed_at, error_message to Scan model
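
The corresponding ORM additions, sketched with an assumed table name and declarative base:

```python
# web/models.py (excerpt, sketch)
from sqlalchemy import Column, DateTime, Integer, Text
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Scan(Base):
    __tablename__ = "scans"             # assumed table name

    id = Column(Integer, primary_key=True)
    # ... existing columns elided ...
    started_at = Column(DateTime, nullable=True)    # set when the job starts
    completed_at = Column(DateTime, nullable=True)  # set on success or failure
    error_message = Column(Text, nullable=True)     # populated for failed scans
```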

### Testing (tests/test_background_jobs.py)
- 13 unit tests for background job execution
- Scheduler initialization and configuration tests
- Job queuing and status tracking tests
- Scan timing field tests
- Error handling and storage tests
- Integration test for full workflow (skipped by default)
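
One of the simpler unit tests might look like this (import path and fixture-free style are assumptions):

```python
# tests/test_background_jobs.py (sketch)
from web.services.scheduler_service import SchedulerService


def test_queue_scan_uses_expected_job_id():
    service = SchedulerService(max_workers=1)
    try:
        job = service.queue_scan(99)
        assert job.id == "scan_99"      # job IDs follow scan_{scan_id}
    finally:
        service.shutdown()              # graceful: waits for running jobs
```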

## Features

- Async scan execution without blocking HTTP requests
- Concurrent scan support (configurable max workers)
- Isolated database sessions per background thread
- Scan lifecycle tracking: created → running → completed/failed
- Error messages captured and stored in database
- Job monitoring and management capabilities
- Graceful shutdown waits for running jobs

## Implementation Notes

- Scanner runs as a subprocess launched from the background thread
- Docker provides necessary privileges (--privileged, --network host)
- Each job gets an isolated SQLAlchemy session (avoids database locking)
- Job IDs follow the pattern scan_{scan_id}
- Background jobs keep running after the triggering HTTP request returns
- Failed jobs store error messages in database
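
The subprocess hand-off noted above, in rough form; the command line shown is purely illustrative, not the actual SneakyScan invocation:

```python
# Sketch of the scanner subprocess call -- command and arguments are assumptions
import subprocess


def run_scanner_subprocess(target: str) -> subprocess.CompletedProcess:
    cmd = ["python", "sneakyscan.py", "--target", target]   # hypothetical CLI
    # check=True raises CalledProcessError on a non-zero exit, which
    # execute_scan() would record in error_message for the failed scan.
    return subprocess.run(cmd, capture_output=True, text=True, check=True)
```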

## Documentation (docs/ai/PHASE2.md)
- Updated progress: 6/14 days complete (43%)
- Marked Step 3 as complete
- Added detailed implementation notes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-14 09:24:00 -06:00
986c0d3d17 Complete Phase 1: Foundation - Flask web application infrastructure
Implement the complete database schema and Flask application structure for the
SneakyScan web interface. This establishes the foundation for web-based
scan management, scheduling, and visualization.

Database & ORM:
- Add 11 SQLAlchemy models for comprehensive scan data storage
  (Scan, ScanSite, ScanIP, ScanPort, ScanService, ScanCertificate,
  ScanTLSVersion, Schedule, Alert, AlertRule, Setting)
- Configure Alembic migrations system with initial schema migration
- Add init_db.py script for database initialization and password setup
- Support both migration-based and direct table creation
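
init_db.py supporting both paths could look roughly like this (flag name and module layout are assumptions):

```python
# init_db.py (sketch)
import argparse

from alembic import command
from alembic.config import Config

from web.database import engine        # assumed engine factory
from web.models import Base            # assumed declarative base with all 11 models


def main():
    parser = argparse.ArgumentParser(description="Initialize the SneakyScan database")
    parser.add_argument("--no-migrations", action="store_true",
                        help="create tables directly instead of running Alembic")
    args = parser.parse_args()

    if args.no_migrations:
        Base.metadata.create_all(engine)                 # direct table creation
    else:
        command.upgrade(Config("alembic.ini"), "head")   # migration-based


if __name__ == "__main__":
    main()
```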

Settings System:
- Implement SettingsManager with automatic encryption for sensitive values
- Add Fernet encryption for SMTP passwords and API tokens
- Implement PasswordManager with bcrypt password hashing (work factor 12)
- Initialize default settings for SMTP, authentication, and retention
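
The encryption and hashing pieces, sketched against the standard cryptography and bcrypt APIs; class internals and key names here are assumptions:

```python
# Settings encryption and password hashing (sketch)
import bcrypt
from cryptography.fernet import Fernet


class SettingsManager:
    SENSITIVE_KEYS = {"smtp_password", "api_token"}   # assumed key names

    def __init__(self, fernet_key: bytes):
        self._fernet = Fernet(fernet_key)

    def encrypt_if_sensitive(self, key: str, value: str) -> str:
        if key in self.SENSITIVE_KEYS:
            return self._fernet.encrypt(value.encode()).decode()
        return value

    def decrypt_if_sensitive(self, key: str, stored: str) -> str:
        if key in self.SENSITIVE_KEYS:
            return self._fernet.decrypt(stored.encode()).decode()
        return stored


class PasswordManager:
    @staticmethod
    def hash_password(password: str) -> str:
        # work factor 12 as described above
        return bcrypt.hashpw(password.encode(), bcrypt.gensalt(rounds=12)).decode()

    @staticmethod
    def verify_password(password: str, hashed: str) -> bool:
        return bcrypt.checkpw(password.encode(), hashed.encode())
```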

Flask Application:
- Create Flask app factory pattern with scoped session management
- Add 4 API blueprints: scans, schedules, alerts, settings
- Implement functional Settings API (GET/PUT/DELETE endpoints)
- Add CORS support, error handlers, and request/response logging
- Configure development and production logging to file and console
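
The factory-plus-blueprints wiring, roughly (blueprint variable names and the db_session scoped session are assumptions):

```python
# web/app.py (Phase 1 sketch)
from flask import Flask
from flask_cors import CORS

from web.api.alerts import alerts_bp          # assumed blueprint names
from web.api.scans import scans_bp
from web.api.schedules import schedules_bp
from web.api.settings import settings_bp
from web.database import db_session           # hypothetical scoped_session


def create_app() -> Flask:
    app = Flask(__name__)
    CORS(app)

    for bp in (scans_bp, schedules_bp, alerts_bp, settings_bp):
        app.register_blueprint(bp)

    @app.errorhandler(404)
    def not_found(err):
        return {"error": "not found"}, 404

    @app.teardown_appcontext
    def cleanup_session(exc=None):
        db_session.remove()                    # scoped session cleanup per request

    return app
```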

Docker & Deployment:
- Update Dockerfile to install Flask dependencies
- Add docker-compose-web.yml for web application deployment
- Configure volume mounts for database, output, and logs persistence
- Expose port 5000 for Flask web server

Testing & Validation:
- Add validate_phase1.py script to verify all deliverables
- Validate directory structure, Python syntax, models, and endpoints
- All validation checks passing
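
A sketch of the kind of checks validate_phase1.py performs (paths and expectations assumed):

```python
# validate_phase1.py (sketch)
import ast
import pathlib
import sys

REQUIRED_DIRS = ["web", "web/api", "migrations"]          # assumed layout


def validate() -> bool:
    ok = True
    for d in REQUIRED_DIRS:
        if not pathlib.Path(d).is_dir():
            print(f"MISSING: {d}")
            ok = False
    for py in pathlib.Path("web").rglob("*.py"):
        try:
            ast.parse(py.read_text())                     # Python syntax check
        except SyntaxError as exc:
            print(f"SYNTAX ERROR: {py}: {exc}")
            ok = False
    return ok


if __name__ == "__main__":
    sys.exit(0 if validate() else 1)
```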

Documentation:
- Add PHASE1_COMPLETE.md with comprehensive Phase 1 summary
- Update ROADMAP.md with Phase 1 completion status
- Update .gitignore to exclude database files and documentation

Files changed: 21
- New: web/ directory with complete Flask app structure
- New: migrations/ with Alembic configuration
- New: requirements-web.txt with Flask dependencies
- Modified: Dockerfile, ROADMAP.md, .gitignore
2025-11-13 23:59:23 -06:00