Warning This document has not been updated for a while now. It may be out of date.
Last updated: 4 Sep 2023

router: ADR 001 - How we migrate Router/Router-API from Mongo to Postgres

Context

We need to migrate router (and router-api) off MongoDB as its datastore. The majority of the GOV.UK platform is on PostgreSQL, and there is no longer a strong enough case for keeping router data on MongoDB.

Even the content-store is migrating from MongoDB to PostgreSQL.

This ADR outlines our decisions, and the rationale behind them, for the steps we will take in that migration:

Decision

  • Spin up a completely separate stack of router and router-api (as forks, govuk-docker first) that talks only to a PostgreSQL database
  • Provision a new PostgreSQL database in EKS
  • Provision the new p-router/p-router-api apps
  • Amend content-store so that messages sent to router-api are duplicated to p-router-api
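The duplication step above can be sketched as a dual write inside content-store: every route update still goes to the live router-api, and is then sent best-effort to p-router-api. This is a minimal illustrative sketch, not the real content-store code; the function and parameter names are hypothetical.

```python
def duplicate_route_update(update, send_to_router_api, send_to_p_router_api):
    """Send a route update to the live router-api, then duplicate it to p-router-api.

    The old (Mongo-backed) stack stays authoritative during the migration:
    a failure writing to the new Postgres stack is reported to the caller
    but must never fail the live write.
    """
    send_to_router_api(update)           # live path: failures propagate as normal
    try:
        send_to_p_router_api(update)     # shadow path: failures are swallowed
        return True
    except Exception:
        return False                     # caller can log/alert on shadow failures
```

The key design point is the asymmetry: the shadow write is wrapped in a rescue so the new stack can be broken, redeployed, or backfilled without affecting live traffic.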

Benefits of doing it this way

  • While updates are being made to both databases, we can backfill data from before that point in time without loss of service or downtime for the live (old) service
  • We can test traffic going to the new p-router stack in isolation
  • Switchover only requires a DNS change
  • We can rename p-router once we decommission the old router
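The backfill described in the first benefit only works without downtime if replaying old Mongo records can never overwrite a newer row produced by the dual write. One common way to get that property is a last-write-wins upsert keyed on the route path. This is a hedged sketch under that assumption; a plain dict stands in for the Postgres table, and the field names are illustrative.

```python
def backfill_route(pg_routes, mongo_route):
    """Upsert one historical Mongo route into the new Postgres store.

    Keyed on the route path; keeps whichever record was updated most
    recently, so a backfill running alongside live dual writes can be
    replayed safely and can never clobber a newer row.
    """
    existing = pg_routes.get(mongo_route["path"])
    if existing and existing["updated_at"] >= mongo_route["updated_at"]:
        return False  # dual write already produced a newer row; skip
    pg_routes[mongo_route["path"]] = mongo_route
    return True
```

In real Postgres this maps naturally onto `INSERT ... ON CONFLICT (path) DO UPDATE ... WHERE EXCLUDED.updated_at > routes.updated_at`, which keeps the operation idempotent.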

Risks

  • We never rename p-router
  • DNS misconfiguration, or stale DNS caching, could cause downtime for requests trying to reach origin

Alternatives

  • We update router and router-api and deploy the changes to the existing version

Risks

  • A big-bang deployment
  • Running two databases under one app
  • Duplicated code/logic to handle that
  • Harder to clean up

Status

Accepted

Reverted

On the recommendation of the Tech Lead of the Platform Engineering team:

For simplicity, we concluded that, assuming the separate Postgres version of router-api is already running in the background and is fully up to date (that is, all changes made before it went "live" are synced, plus anything new after that point), there is no reason for Router not to be delivered as normal.