# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Commands

First-time setup:

```bash
cp .env.example .env   # fill in SECRET_KEY, DB_PASSWORD, OAuth credentials
```

Run the app (the only way to test — no host Python):

```bash
docker compose up --build
```
Database migrations:

```bash
# Applied automatically on every `docker compose up` via entrypoint.sh.
# To create a new migration after changing models.py (DB must be running):
DB_PASS=$(grep DB_PASSWORD .env | cut -d= -f2)
docker run --rm -v "$(pwd)":/app -w /app \
  -e FLASK_APP=app -e DATABASE_URL="postgresql+psycopg://ballistic:${DB_PASS}@db:5432/ballistic" \
  -e SECRET_KEY=dev --network ballistictool_default --entrypoint flask \
  ballistictool-web db migrate -m "description"

# Then restart so entrypoint.sh applies it:
docker compose up --build -d
```
**Important:** `docker compose run web flask db ...` won't work for `init`/`migrate` because the container is ephemeral and writes migration files to its own filesystem. Use the bind-mount `docker run` form above so the files persist to the host and get committed to git.
Smoke-test imports without starting the DB:

```bash
docker compose run --no-deps --rm --entrypoint python web -c "from app import create_app; create_app()"
```

The host Python environment is externally managed (Debian). Do not run `pip install` on the host. All dependency testing must happen inside Docker.
## Project structure

- `app.py` — `create_app()` factory; registers extensions, blueprints, and core routes
- `config.py` — `Config` class reading all env vars (SECRET_KEY, DATABASE_URL, OAuth keys)
- `extensions.py` — module-level `db` / `login_manager` / `migrate` instances (no `init_app` here)
- `models.py` — SQLAlchemy models: `User`, `EquipmentItem`, `ShootingSession`, `Analysis`
- `storage.py` — file I/O helpers: `save_analysis()`, `save_equipment_photo()`
- `blueprints/` — feature blueprints (auth, dashboard, analyses, equipment, sessions)
- `migrations/` — Alembic migration scripts (committed to git)
- `.env` — gitignored secrets (copy from `.env.example`)
- `entrypoint.sh` — runs `flask db upgrade`, then starts gunicorn
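The `extensions.py` / `app.py` split follows the standard two-phase `init_app` pattern. A minimal sketch of the pattern (the `Ext` class is a stand-in for the real `SQLAlchemy()` / `LoginManager()` / `Migrate()` instances, and the dict stands in for `Flask(__name__)`):

```python
class Ext:
    """Stand-in extension exposing the two-phase init_app() API."""
    def __init__(self):
        self.app = None

    def init_app(self, app):
        # Binding happens here, not in __init__ — so the instance can be
        # created at import time without an app existing yet.
        self.app = app

# "extensions.py": module-level instances, no init_app here
db = Ext()
login_manager = Ext()

# "app.py": the factory creates the app and binds every extension to it
def create_app(config=None):
    app = {"config": dict(config or {})}   # stand-in for Flask(__name__)
    for ext in (db, login_manager):
        ext.init_app(app)
    return app
```

This keeps blueprints free to `from extensions import db` without circular imports, since the module-level instances exist before any app does.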
## Architecture
Flask web app that processes ballistic CSV data, computes statistics, renders charts, and generates PDF reports.
Request flow for `POST /analyze`:

```
Upload CSV
→ analyzer/parser.py     parse_csv()       — normalize CSV (handles locale variants)
→ analyzer/grouper.py    detect_groups()   — split into shot groups by time gaps
→ analyzer/stats.py      compute_*_stats() — per-group + overall statistics
→ analyzer/charts.py     render_*_charts() — base64 PNG images via matplotlib
→ analyzer/pdf_report.py generate_pdf()    — fpdf2 multi-page PDF (returned as bytes)
→ templates/results.html                   — renders stats + embedded images + PDF link
```
Group detection algorithm (`grouper.py`): splits shots where the gap between consecutive timestamps exceeds `median_gap × OUTLIER_FACTOR` (`OUTLIER_FACTOR` = 5). This is the core domain logic that determines what counts as a separate shooting session.
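The splitting rule above can be sketched in a few lines — an illustration of the described rule, not the actual `grouper.py` code:

```python
from statistics import median

OUTLIER_FACTOR = 5  # gap > median_gap * 5 starts a new group

def detect_groups(timestamps):
    """Split sorted shot timestamps (seconds) into groups wherever the
    gap to the previous shot exceeds median_gap * OUTLIER_FACTOR."""
    if len(timestamps) < 2:
        return [list(timestamps)] if timestamps else []
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    threshold = median(gaps) * OUTLIER_FACTOR
    groups, current = [], [timestamps[0]]
    for prev, t in zip(timestamps, timestamps[1:]):
        if t - prev > threshold:
            groups.append(current)   # close the group before the big gap
            current = []
        current.append(t)
    groups.append(current)
    return groups

# Two bursts separated by a ~55 s pause → two groups:
# detect_groups([0, 2, 3, 5, 60, 61, 63]) → [[0, 2, 3, 5], [60, 61, 63]]
```

Using the *median* gap (rather than the mean) keeps the threshold robust: one long pause between sessions barely moves it.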
CSV parsing (`parser.py`): handles BOM, various decimal separators (`.` / `,`), and time formats (`HH:MM:SS`, `HH:MM:SS.fff`, `HH:MM:SS,fff`). Expected columns map French headers to the internal names `speed`, `std_dev`, `energy`, `power_factor`, `time`.
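A minimal sketch of the normalization step, assuming a `;`-delimited export with comma decimals (the real `parser.py` additionally maps the French headers and parses the time formats):

```python
import csv
import io

def parse_numeric_csv(raw: bytes):
    """Decode a locale-variant CSV export into rows of floats.

    utf-8-sig strips a leading BOM if one is present; replacing "," with
    "." normalizes comma decimal separators before float().
    """
    text = raw.decode("utf-8-sig")
    reader = csv.DictReader(io.StringIO(text), delimiter=";")
    return [
        {key: float(value.replace(",", ".")) for key, value in row.items()}
        for row in reader
    ]
```

The `;` delimiter is an assumption here — French locales typically use `;` precisely because `,` is the decimal separator.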
Charts use matplotlib's Agg (non-interactive) backend. Images are base64-encoded and embedded directly in HTML and PDF — no static asset serving.
PDF is returned as raw bytes from `generate_pdf()` and served inline via Flask's `send_file`.
## Stack
- Python 3.12, Flask 3.0, gunicorn (2 workers, port 5000)
- PostgreSQL 16 via Docker Compose; SQLAlchemy 2.0 + Flask-Migrate (Alembic) for ORM/migrations
- DB driver: `psycopg[binary]` (psycopg3) — connection URL scheme is `postgresql+psycopg://`
- Auth: Authlib (OAuth2 flows) + Flask-Login (session / `current_user`); providers: Google, GitHub
- File storage: Docker volume at `/app/storage`; Pillow for equipment photo validation/resize
- pandas + numpy for data processing; matplotlib for charts; fpdf2 for PDF generation
- Docker / Docker Compose for deployment (no host install)