Programs Portal & Editorial Review Platform
Web portal where institutions submit, review, and publish educational programmes. Submissions move through a multi-stage editorial workflow; published programmes appear in a faceted public catalogue searchable by subject, level, region, and outcome.
From submission to a searchable catalogue.
Institutions submit programmes through their dashboard. Editors move each submission through a Kanban review and approve, return, or publish. Once published, the programme appears in a faceted public catalogue where visitors filter by subject, level, region, and outcome.
Submit a programme for review
FutureLearn Inst. · 4 active programmes
This is an animated mockup of the programmes-portal capability, not a live product. Institution names and programme titles are illustrative.
Institutional submission portal
Institutions create an account, submit programmes with the required documentation, and track them through the editorial pipeline from their own dashboard.
Editorial Kanban
Editors move programmes through Draft → Submitted → Under review → Approved → Published, or return them for changes with a comment thread attached.
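The stage progression above can be sketched as a small state machine. The stage names come from the Kanban; the exact transition rules (which stages can be returned for changes, and where a return lands) are illustrative assumptions, not the product's actual model:

```typescript
// Editorial pipeline stages, as named on the Kanban board.
type Stage = "draft" | "submitted" | "under-review" | "approved" | "published";

// Allowed moves per stage. "Return for changes" is modelled here as a
// transition back to draft — an assumption for illustration.
const transitions: Record<Stage, Stage[]> = {
  draft: ["submitted"],
  submitted: ["under-review"],
  "under-review": ["approved", "draft"], // approve, or return for changes
  approved: ["published", "draft"],      // publish, or return for changes
  published: [],                         // terminal
};

function canMove(from: Stage, to: Stage): boolean {
  return transitions[from].includes(to);
}
```

Encoding the board as data like this makes every transition checkable (and loggable) in one place, which is what lets each move be recorded for the audit trail mentioned below.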
Approve / return / publish
Editors approve, request changes, or publish. Each transition is logged; submitters see real-time status without an email thread.
Faceted public catalogue
Visitors filter by subject, level, region, and outcome. Programmes carry rich metadata for discovery, and results reorder live as filters change.
Dedicated search index
The catalogue sits on a dedicated search index, separate from the editorial database, so discovery stays fast as the programme count grows.
Programme analytics
Institutions see programme views, applications, and conversions. Editors see submission volume and approval rates across each cycle.
Submission portal for institutions, an editorial review workflow with multiple approval stages, and a public catalogue served from a dedicated search index. Each programme carries rich metadata for faceted discovery; the index sits separate from the content store so search stays fast as the catalogue grows.
Institutions create an account, fill in a submission form with required documentation, and watch their programme move through the editorial pipeline: draft → submitted → under review → approved (or returned for changes) → published. Once published, programmes appear in a faceted public catalogue where visitors filter by subject, level, region, and outcome. The catalogue is served from a dedicated search index, so discovery stays fast as it grows past tens of thousands of programmes.
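The faceted filtering described above can be illustrated with a minimal in-memory sketch. The four facet fields come from the page; the record shape and sample data are assumptions (in production this query would run against the dedicated search index, not application code):

```typescript
// Assumed catalogue record shape; the four facet fields are from the page.
interface Programme {
  title: string;
  subject: string;
  level: string;
  region: string;
  outcome: string;
}

// A facet selection is any subset of the four facet fields.
type Facets = Partial<Pick<Programme, "subject" | "level" | "region" | "outcome">>;

// A programme matches when every selected facet equals its field value.
function filterCatalogue(items: Programme[], facets: Facets): Programme[] {
  return items.filter((p) =>
    (Object.entries(facets) as [keyof Programme, string][]).every(
      ([field, value]) => p[field] === value
    )
  );
}
```

Re-running the filter on every facet change is what produces the "results reorder live" behaviour; the search index does the same conjunctive match at scale.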
How a request flows through it
Each request enters at the top of the diagram, flows through every box, and lands at the bottom, exactly the way the production system behaves. The scan-line traces where a live request would be right now.
What it's built with
The interesting parts
Institutional submission portal
Institutions create an account, submit programmes with full documentation, and track them through the editorial pipeline from their own dashboard.
Multi-stage editorial workflow
Editors approve, request changes, or publish; each stage is logged and routable to the right reviewer. Submitters see real-time status without emailing back and forth.
Faceted catalogue with dedicated search index
Visitors filter by subject, level, region, and outcome. The search index lives separate from the content store so the discovery surface scales independently of submission volume.
Containerised delivery + CI/CD
Containers keep dev and prod in parity; releases ship through an automated pipeline, so editorial workflow changes go out without manual deploys.
The calls that did most of the work
A handful of engineering choices shape how a system feels. Here are the ones we'd still defend, alongside what each one cost.
Dedicated search index separate from the content store
Built-in CMS search isn't designed for faceted relevance ranking across thousands of submissions. A dedicated search engine keeps the discovery surface fast and lets the catalogue scale independently of submission volume.
Tradeoff: Two data stores to keep in sync. Content lives in the editorial database; the searchable view lives in the search index. Indexer jobs need to be reliable enough that the public catalogue never drifts from the source.
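One common way to keep the two stores from drifting is a re-runnable sync job: upsert every published programme into the index, then prune anything no longer published, so repeated runs converge on the source of truth. This is a hypothetical sketch with an in-memory stand-in for the search index; the interfaces are assumptions, not the product's actual API:

```typescript
// Assumed minimal document shape flowing from the editorial store.
interface Doc { id: string; status: string; title: string; }

// In-memory stand-in for the search index, for illustration only.
class FakeIndex {
  private docs = new Map<string, Doc>();
  upsert(doc: Doc): void { this.docs.set(doc.id, doc); }
  delete(id: string): void { this.docs.delete(id); }
  ids(): string[] { return [...this.docs.keys()]; }
}

// Idempotent sync: safe to re-run after a failure, since upserts are
// keyed by id and pruning removes anything unpublished or withdrawn.
function syncIndex(source: Doc[], index: FakeIndex): void {
  const published = source.filter((d) => d.status === "published");
  const keep = new Set(published.map((d) => d.id));
  for (const doc of published) index.upsert(doc);
  for (const id of index.ids()) {
    if (!keep.has(id)) index.delete(id);
  }
}
```

Making the job idempotent is what buys the reliability the tradeoff calls for: a crashed run can simply be retried without double-indexing or leaving stale entries behind.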
Containerised dev/prod parity
A small team working across multiple environments avoids the 'works on my machine' class of bugs by treating the container as the unit of deployment. Editorial workflow changes ship without manual deploy steps.
Tradeoff: Local dev gains an extra layer to debug when something misbehaves, but the savings on environment-drift bugs more than pay it back.
Custom workflow over off-the-shelf submission forms
Submission + multi-stage editorial review has domain-specific rules (which reviewers see which stage, what triggers a return-for-changes, how revisions are tracked). Off-the-shelf form modules don't model those constraints โ building the workflow as first-class code keeps the rules close to the domain.
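"Keeping the rules close to the domain" can mean something as simple as a permission table in code. This sketch shows one such rule: which roles may take which editorial actions. The role names and the senior-only publish rule are illustrative assumptions, not the product's actual policy:

```typescript
// Assumed reviewer roles and editorial actions, for illustration.
type Role = "editor" | "senior-editor";
type Action = "approve" | "return" | "publish";

// Domain rule as data: who may do what. Assumed: only senior editors publish.
const permissions: Record<Action, Role[]> = {
  approve: ["editor", "senior-editor"],
  return: ["editor", "senior-editor"],
  publish: ["senior-editor"],
};

function mayPerform(role: Role, action: Action): boolean {
  return permissions[action].includes(role);
}
```

Because the rule is first-class code, changing who may publish is a one-line edit with a type-checked blast radius, rather than a configuration hunt through a form add-on.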
Tradeoff: Custom workflow code carries an upgrade cost on every platform-version bump, instead of being a simple add-on swap.
Tell us what you're building.
Free 30-minute call. Real humans, real timelines, no follow-up emails forever.