MCU Flowchart is a public-friendly monorepo for exploring Marvel screen media as structured metadata. It combines a FastAPI backend, a Next.js frontend, and a JSON dataset that powers media pages, watching-order guidance, and relationship graphs.
The project is built around community-maintained metadata. If a title is missing, a relation looks wrong, or you disagree with how much context is required before watching something, contributions are welcome.
You can contribute by opening an issue or pull request for:
- new media entries in `dataset/data/media/`;
- poster references in `dataset/data/posters/`;
- universe and saga metadata in `dataset/data/universes/` and `dataset/data/sagas/`;
- connection updates in a media file's `connections.required`, `connections.optional`, or `connections.references` arrays;
- corrections to summaries, release dates, phases, sagas, or universe assignments.
Before opening a PR, run the dataset validator:
```
python scripts/validate_data.py
```

When changing relations, prefer a short `reason` explaining why the connection exists. Use `required` only when the linked media is important to understanding the story; use `optional` for helpful context; use `references` for callbacks, cameos, easter eggs, or lighter continuity links.
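As a rough illustration of these tiers, a media file's connection groups might look like the following (the ids and reasons here are invented for illustration, not real dataset entries):

```json
{
  "connections": {
    "required": [
      { "media_id": "example-prequel", "reason": "Direct continuation of this story." }
    ],
    "optional": [
      { "media_id": "example-team-up", "reason": "Introduces a supporting character." }
    ],
    "references": [
      { "media_id": "example-cameo", "reason": "Post-credits cameo callback." }
    ]
  }
}
```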
- `apps/api`: FastAPI service that validates and serves the dataset.
- `apps/web`: Next.js application for browsing media and graph views.
- `dataset`: JSON metadata, schemas, posters, universes, and sagas.
- `scripts`: validation and smoke-test utilities.
- `.github/workflows`: deployment and release automation.
```
apps/
  api/            # FastAPI backend
  web/            # Next.js frontend
dataset/
  data/
    media/        # One JSON file per title
    posters/      # Poster assets referenced by metadata
    sagas/        # Saga metadata
    universes/    # Universe metadata
  schemas/        # JSON schemas
scripts/          # Validation and smoke scripts
docker-compose.yml  # Local full-stack runtime
```
- Python 3.11+
- Node.js 24+
- Docker and Docker Compose, optional but recommended for local full-stack runs
From the repository root:
```
docker compose up --build
```

Services are exposed at:

- Web app: http://localhost:3001
- API: http://localhost:8001
- API docs: http://localhost:8001/docs
Stop the stack with:

```
docker compose down
```

To run the API locally:

```
cd apps/api
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements.txt
python ../../scripts/validate_data.py
python run.py
```

On Windows PowerShell, activate the virtual environment with:

```
.\.venv\Scripts\Activate.ps1
```

To run the frontend locally:

```
cd apps/web
npm install
cp .env.example .env.local
npm run dev
```

The frontend runs on http://localhost:3001 and proxies `/api/*` requests to the API.
Each file in `dataset/data/media/*.json` describes one title. Required fields include:

- `id`: stable unique identifier;
- `title`: display title;
- `release_date`: ISO date, `YYYY-MM-DD`;
- `universe`: one of the universe ids defined by the schema;
- `mediatype`: `movie`, `show`, or `special`;
- `poster`: poster URL path, usually `/posters/<id>.jpg`;
- `summary`: non-empty description;
- `connections`: relation groups for `required`, `optional`, and `references`.
Connection objects can target either a `media_id` or a `saga_id`, and may include:

- `reason`: human-readable explanation;
- `importance`: number between `0` and `1`.

See `dataset/README.md` and `dataset/schemas/media.schema.json` for the full data contract.
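As a rough illustration of the contract above, the checks a validator performs might be sketched as follows. This is a simplified, hypothetical example, not the actual logic of `scripts/validate_data.py`; the real rules live in `dataset/schemas/media.schema.json`:

```python
from datetime import date

# Required top-level fields described by the data contract.
REQUIRED_FIELDS = {"id", "title", "release_date", "universe",
                   "mediatype", "poster", "summary", "connections"}

def basic_media_check(entry: dict) -> list[str]:
    """Return a list of problems with a media entry (empty list = OK)."""
    problems = [f"missing field: {f}"
                for f in sorted(REQUIRED_FIELDS - entry.keys())]
    # release_date must be an ISO YYYY-MM-DD date.
    try:
        date.fromisoformat(entry.get("release_date", ""))
    except ValueError:
        problems.append("release_date is not an ISO YYYY-MM-DD date")
    # importance, when present, must fall between 0 and 1.
    for group in ("required", "optional", "references"):
        for conn in entry.get("connections", {}).get(group, []):
            imp = conn.get("importance")
            if imp is not None and not 0 <= imp <= 1:
                problems.append(f"importance out of range in {group}: {imp}")
    return problems

# Example usage with a made-up entry:
entry = {
    "id": "example-movie", "title": "Example Movie",
    "release_date": "2019-04-26", "universe": "example-universe",
    "mediatype": "movie", "poster": "/posters/example-movie.jpg",
    "summary": "An illustrative entry.",
    "connections": {"required": [], "optional": [], "references": []},
}
print(basic_media_check(entry))  # []
```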
Run from the repository root:

```
python scripts/validate_data.py
```

Run API tests:

```
cd apps/api
python -m pytest -q
```

Run frontend checks:

```
cd apps/web
npm run lint
npm run build
```

Run an API smoke test from the repository root:

```
python scripts/smoke_api.py
```

The repository includes GitHub Actions workflows for:
- building and publishing Docker images to GitHub Container Registry;
- deploying the API and web services from `stable` with `docker-compose.server.yml`;
- regenerating release notes so stable releases list the feature PRs merged into `main`.
`GITHUB_TOKEN` is provided automatically by GitHub Actions for GHCR authentication. A self-hosted Linux runner with Docker access is required for the deployment workflow.
- API details: `apps/api/README.md`
- Frontend details: `apps/web/README.md`
- Dataset details: `dataset/README.md`