Console Coins: Round-Up Savings with Investec and GCP

2026-03-04


Console Coins - automated round-up savings architecture on GCP

Every time I tap my card, the spare change moves itself into savings. No willpower required.

I've always liked the idea of round-up savings -- apps like Acorns and Revolut do it, but I wanted to own the pipeline. I bank with Investec, they have a programmable banking API, and I already run workloads on GCP. So I built Console Coins: an event-driven system that listens for card transactions, calculates the round-up, and automatically transfers the difference to my cash management account.


Why build it yourself when apps exist?

Honestly? Because I could. Investec's Open API gives you real access to your accounts -- OAuth2 auth, transaction data, inter-account transfers. And I wanted a project that forced me to wire together Cloud Functions, Pub/Sub, Firestore, BigQuery, and Terraform into something that actually does something useful.

Plus, I get full visibility into every transfer. No black box.


How it works -- the 30-second version

The architecture is two Cloud Functions chained together via Pub/Sub:

  • Function 1 (Orchestration): Receives card transaction events from a Pub/Sub topic, calculates the round-up amount, and publishes transfer instructions to a second topic.
  • Function 2 (Transfer): Picks up those instructions, checks Firestore to avoid duplicate transfers, authenticates with Investec's OAuth2 API, and executes the inter-account transfer.
  • BigQuery: A separate Pub/Sub subscription pushes all transfer data into BigQuery for analytics.

Everything is deployed with Terraform and a GitLab CI/CD pipeline. Dev and prod are separate GCP projects.
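As a sketch, Function 1's core reduces to two pure steps: decode the base64 Pub/Sub payload, then turn a transaction into a transfer instruction. The interfaces and field names below are illustrative stand-ins, not Investec's actual event schema:

```typescript
// Illustrative shapes -- not the real Investec event schema.
interface CardTransaction {
  reference: string;
  accountId: string;
  centsAmount: number;
}

interface TransferInstruction {
  transactionReference: string;
  fromAccountId: string;
  transferCents: number;
}

// Pub/Sub delivers the payload base64-encoded in message.data.
function decodePubSubMessage<T>(data: string): T {
  return JSON.parse(Buffer.from(data, "base64").toString("utf8")) as T;
}

// Pure core of Function 1: compute the round-up and build the
// instruction that gets published to the second topic.
function buildInstruction(
  tx: CardTransaction,
  roundUp: (cents: number) => number,
): TransferInstruction {
  return {
    transactionReference: tx.reference,
    fromAccountId: tx.accountId,
    transferCents: roundUp(tx.centsAmount),
  };
}
```

Keeping the core pure makes it unit-testable without a Pub/Sub emulator; the real handler just wires these into the event callback and a publish call.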


The rounding logic -- where the magic happens

This was the fun part to design. I didn't want a flat "round to the nearest Rand" approach. I wanted the round-up to scale with the transaction size:

function customRoundDifference(centAmount: number): number {
  const randAmount = centAmount / 100;

  // Small purchases round up to the nearest R10, larger ones to the nearest R100.
  const roundTo = randAmount < 50 ? 10 : 100;
  const roundedRandAmount = Math.ceil(randAmount / roundTo) * roundTo;

  // Difference between the rounded and actual amount, converted back to cents.
  const transferAmount = Math.round(Math.abs(roundedRandAmount - randAmount) * 100);

  // Never transfer less than R1 (100 cents).
  return transferAmount < 100 ? 100 : transferAmount;
}

Below R50, it rounds up to the nearest R10. R50 and above, it rounds up to the nearest R100. If the difference works out to less than R1, it bumps the transfer up to R1 -- moving a few cents felt pointless.

So a R37 coffee saves R3. A R85 grocery run saves R15. A R450 fill-up saves R50. It adds up faster than you'd think.
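Those numbers check out if you run the function (repeated here so the snippet stands alone -- note the input is in cents):

```typescript
function customRoundDifference(centAmount: number): number {
  const randAmount = centAmount / 100;
  const roundTo = randAmount < 50 ? 10 : 100;
  const roundedRandAmount = Math.ceil(randAmount / roundTo) * roundTo;
  const transferAmount = Math.round(Math.abs(roundedRandAmount - randAmount) * 100);
  return transferAmount < 100 ? 100 : transferAmount;
}

console.log(customRoundDifference(3700));  // R37.00 coffee   -> 300  (R3)
console.log(customRoundDifference(8500));  // R85.00 grocery  -> 1500 (R15)
console.log(customRoundDifference(45000)); // R450.00 fill-up -> 5000 (R50)
```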


Idempotency -- the thing I almost forgot

The second function checks Firestore before executing any transfer. Every processed transaction gets recorded by its reference ID. If Pub/Sub retries the message (and it will), the function sees the record and skips it.

const isProcessed = await checkDocumentExists(
  collectionName,
  transactionReference,
);

if (isProcessed) {
  logger.warn(
    `Transaction ${transactionReference} has already been processed. Skipping.`,
  );
  return;
}

I learned this the hard way during testing. Without idempotency, a single retry meant double transfers. Pub/Sub guarantees at-least-once delivery, not exactly-once. That distinction matters when real money is moving.


Talking to the Investec API

The transfer function authenticates using OAuth2 client credentials, then hits Investec's transfer endpoint. The request payload is straightforward -- you provide the destination account, the amount, and references for both sides of the transaction:

const requestData = {
  transferList: [
    {
      beneficiaryAccountId: toAccountId,
      amount: amount,
      myReference: myReference,
      theirReference: theirReference,
    },
  ],
};
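A sketch of assembling that payload. Investec's docs show the amount as a decimal Rand string, so the builder converts from cents -- treat the exact field types as assumptions and verify against the current API reference:

```typescript
interface TransferItem {
  beneficiaryAccountId: string;
  amount: string; // decimal Rand, e.g. "3.00" -- assumed per the API docs
  myReference: string;
  theirReference: string;
}

function buildTransferRequest(
  toAccountId: string,
  amountCents: number,
  myReference: string,
  theirReference: string,
): { transferList: TransferItem[] } {
  return {
    transferList: [
      {
        beneficiaryAccountId: toAccountId,
        amount: (amountCents / 100).toFixed(2), // cents -> "R.cc" string
        myReference,
        theirReference,
      },
    ],
  };
}
```

Because the builder is pure, the serialization can be tested without ever touching the bank's API; the authenticated POST is just plumbing around it.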

Secrets live in GCP Secret Manager and get injected into the Cloud Functions as environment variables via Terraform. No credentials in code, no .env files floating around.

secret_environment_variables {
  key        = "BANK_CLIENT_ID"
  project_id = local.gcp_project_id
  secret     = google_secret_manager_secret.bank_client_id.secret_id
  version    = "latest"
}
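On the function side, those bindings surface as plain environment variables. A small guard at cold start (a hypothetical helper, not from the project) turns a missing binding into an immediate, obvious failure instead of a confusing 401 later:

```typescript
// Fail fast if a secret binding didn't make it into the environment.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// e.g. const clientId = requireEnv("BANK_CLIENT_ID");
```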


Infrastructure as Code -- all Terraform

The entire stack is defined in Terraform: Cloud Functions, Pub/Sub topics and subscriptions, BigQuery tables, Secret Manager secrets, service accounts, IAM bindings. Everything.

One thing I had to work around: GCP doesn't support Cloud Functions in the africa-south1 region yet. So the functions run in europe-west1, while the data layer (BigQuery, storage) lives in africa-south1. The secrets are replicated to europe-west1 to stay close to the functions.

resource "google_secret_manager_secret" "bank_client_id" {
  secret_id = "bank_client_id"
  replication {
    user_managed {
      replicas {
        location = "europe-west1" # keep secrets next to the functions
      }
    }
  }
}

The CI/CD pipeline in GitLab handles environment switching automatically -- main branch deploys to production, everything else goes to dev.


BigQuery for visibility

Every transfer instruction gets pushed to BigQuery via a Pub/Sub subscription. A view parses the raw JSON into structured columns:

WITH CTE AS (
  -- Raw Pub/Sub-to-BigQuery export; table name illustrative
  SELECT SAFE.PARSE_JSON(CAST(data AS STRING)) AS data
  FROM   transfers_raw
)
SELECT  STRING(data.accountNumber) AS account_number,
        SAFE_CAST(STRING(data.createDatetime) AS TIMESTAMP) AS create_datetime,
        STRING(data.fromAccount) AS from_account,
        STRING(data.toAccount) AS to_account,
        INT64(data.transactionAmount) AS transaction_amount,
        SAFE.INT64(data.transferAmount) AS transfer_amount,
        STRING(data.transferStatus) AS transfer_status
FROM    CTE
WHERE   UPPER(STRING(data.transactionReference)) <> 'SIMULATION';

That WHERE clause filters out simulation messages I used during testing. Now I can query how much I've saved, spot patterns, and eventually build dashboards.
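The same filter-and-sum the view enables can be sketched over parsed rows -- the row shape below mirrors the view's columns, trimmed to what the aggregation needs:

```typescript
interface TransferRow {
  transactionReference: string;
  transferAmount: number; // cents
}

// Total saved, excluding the simulation messages used during testing.
function totalSavedCents(rows: TransferRow[]): number {
  return rows
    .filter((r) => r.transactionReference.toUpperCase() !== "SIMULATION")
    .reduce((sum, r) => sum + r.transferAmount, 0);
}
```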


Dead letter queues -- because things fail

Both Pub/Sub subscriptions have dead letter topics configured with retry policies. If a message fails 5 times, it lands in the DLQ instead of disappearing or retrying forever.

dead_letter_policy {
  dead_letter_topic     = google_pubsub_topic.inter_account_transfers_topic_dlq.id
  max_delivery_attempts = 5
}

retry_policy {
  minimum_backoff = "10s"
  maximum_backoff = "60s"
}

This was one of those "boring but essential" pieces. In production, you need to know when things go wrong without having to watch logs all day.


What I'd do differently

If I were starting over, I'd probably use Cloud Run instead of Cloud Functions for more control over the runtime.


The takeaway

Building Console Coins taught me that the best personal finance tools are the ones you don't have to think about. The system runs, the spare change accumulates, and I only notice when I check BigQuery and see the total climbing. Programmable banking isn't just a buzzword -- if your bank exposes an API, you can build things that genuinely change how you manage money. You just need to be willing to wire it all together.