Managing finances can be a daunting task, often making you jump through multiple tabs or apps. We're on a mission to make money work for everyone at Monzo, and helping our customers understand their finances is a key part of this.
Enter the unified feed: a single stream of activity on the home screen that lets you glance over the latest updates across all your bank accounts! Less searching around, less app-hopping, less confusion.
In this blog post, we explore how we built this new product that benefits millions of customers each day.
How feeds work at Monzo
The feeds are our most used flow within the apps and mostly consist of transactions and other events ordered by time. The foundation of the feed system is largely made up of two microservices that date back to 2015: the feed API and the feed service. The system also contains other services responsible for generating feed items but for this post we’ll focus on the read path.
1. The app (the frontend) requests every feed item between two dates.
2. The request reaches the feed API, which asks the feed service for the customer's feed items. These items usually reference particular transactions or upcoming payments, but they can also include text and links to display static information.
3. The API enriches this data with transactions, merchants and other information from downstream services before returning it to the frontend.
4. The frontend takes the raw data from the response and decides how each feed item should be displayed.
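The read path above can be sketched in a few lines. This is an illustrative toy, not Monzo's actual services: the function and field names (`get_feed_items`, `transaction_id`, `enrich`) are invented, and the two in-memory stores stand in for the feed service and the downstream transaction service.

```python
def get_feed_items(start, end):
    """Stand-in for the feed service: items in a time window, newest first."""
    items = [
        {"id": "item_1", "created": 120, "transaction_id": "tx_1"},
        {"id": "item_2", "created": 90, "transaction_id": "tx_2"},
    ]
    return [i for i in items if start <= i["created"] < end]

# Stand-in for a downstream service the feed API calls to enrich items.
TRANSACTIONS = {
    "tx_1": {"amount": -350, "merchant": "Coffee Shop"},
    "tx_2": {"amount": -900, "merchant": "Lunch Place"},
}

def enrich(items):
    """Stand-in for the feed API: join each item with its transaction data."""
    return [{**item, **TRANSACTIONS.get(item.get("transaction_id"), {})}
            for item in items]

page = enrich(get_feed_items(0, 200))
```

In the original design, this enriched-but-raw data is what the frontend receives, leaving the display decisions to each app.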
While this approach served us well for many years, it presented some challenges as the bank expanded.
One problem we started seeing was seemingly random out-of-memory errors in the Kubernetes pods that host the API service. It turned out that some customers had so many items within the requested time windows that memory usage would spike dramatically. We mitigated this immediately by increasing each service's allocated memory and reducing the time ranges requested by the apps, but this wasn't a sustainable solution.
Another limitation was that display changes had to be made on every frontend platform. Just changing an item’s icon would require an iOS and Android engineer to make their respective changes, which our customers would only see after the next release cycle (potentially weeks away).
The nearly grade II listed feed system was feeling a bit dated and it underwent its first major renovation when we introduced the dynamic feed service.
What makes a feed dynamic?
The new service sits between the feed API and the feed service. Our frontend calls a new endpoint in the API service that performs authentication and authorisation before routing it to the dynamic feed. This service is now responsible for fetching the feed items and hydrating all the information needed to render them.
The single biggest change with the new service was the use of a backend-driven approach. The backend now decides how each item should be displayed rather than just sending the raw data. This means that once our frontends support the display model components (a one-off investment), rendering changes can be made once in the backend and seen instantly by all our customers. The approach also extends to building new feeds – which is now possible in hours versus the weeks required before.
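The backend-driven shift can be pictured as the backend emitting a display model built from component types the apps already know how to render, instead of raw data. This is a minimal sketch with invented component and field names, not Monzo's actual display model.

```python
def to_display_model(item):
    """Map a raw feed item to a display model the apps render generically."""
    return {
        "type": "feed_item_row",  # hypothetical component the apps support
        "title": item["merchant"],
        "subtitle": item["category"],
        "icon_url": item["merchant_logo"],
        "amount_text": f"£{abs(item['amount_minor']) / 100:.2f}",
    }

row = to_display_model({
    "merchant": "Coffee Shop",
    "category": "Eating out",
    "merchant_logo": "https://example.com/logo.png",
    "amount_minor": -350,
})
```

Changing the title, icon or amount formatting now only touches `to_display_model` on the backend; no app release is needed.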
Another significant change was the addition of pagination, allowing us to request a feed in chunks known as pages. Each page request carries a limit value that caps the number of items returned. The limit lets us set a more reasonable memory request for the pods by preventing the large spikes seen before.
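A minimal sketch of limit-based pagination, assuming a simple cursor that indexes into a time-ordered feed (the real cursor format is an implementation detail we don't know):

```python
def get_page(items, limit, cursor=0):
    """items: full feed, newest first. Returns one page plus the next cursor."""
    page = items[cursor:cursor + limit]
    next_cursor = cursor + limit if cursor + limit < len(items) else None
    return page, next_cursor

feed = list(range(25))              # stand-in feed items
page1, cursor = get_page(feed, 10)  # first 10 items
page2, cursor = get_page(feed, 10, cursor)
```

Because no response ever exceeds `limit` items, memory use per request is bounded regardless of how busy a customer's feed is.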
We migrated years' worth of display logic into the new service, creating a range of new dynamic feeds. These include the primary (which powers the personal, joint and business accounts), pot, and Flex feed types. We also built the transaction feed, which isn't used directly anywhere in the apps but acts as a base for other feeds internally, ensuring that any display changes only need to be made in one place. This concept was vital for the creation of the unified feed, which could reuse all of the logic we had already written.
So, now that we've set the scene, how does the unified feed work? The unified feed reads the customer's accounts to understand which feeds to join. It then creates all of the sub-feeds and pages them in parallel. The results are sorted by time and only the first limit items are kept. These items are then hydrated with information from other services and marshalled into the final display model for the frontend.
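The paging step of the unified feed can be sketched as a k-way merge: each sub-feed page is already sorted newest first, so merging them preserves order without a full re-sort. Parallelism and hydration are omitted, and the names are illustrative.

```python
import heapq
from itertools import islice

def page_unified(sub_feed_pages, limit):
    """Merge per-feed pages (each sorted newest first) and keep `limit` items."""
    merged = heapq.merge(*sub_feed_pages,
                         key=lambda item: item["created"], reverse=True)
    return list(islice(merged, limit))

personal = [{"id": "p1", "created": 100}, {"id": "p2", "created": 40}]
pot      = [{"id": "o1", "created": 90}]
flex     = [{"id": "f1", "created": 95}, {"id": "f2", "created": 10}]

page = page_unified([personal, pot, flex], limit=3)
```

Items `p2` and `f2` are fetched but discarded here, which is exactly the "wasted database reads" trade-off discussed next.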
We don’t know which items will make up each page of the unified feed until we request them all, resulting in some wasted database reads. We were concerned about how this would impact the cost and platform load as we knew the new feed would receive thousands of requests per second. Another idea was to maintain an index which would have a similar role to the feed service but for the unified feed. When we create a personal account feed item, we would emit a Kafka event that would be consumed and written to the unified feed index. We did some data modelling to predict the cost of both approaches and calculated that indexing would take over five years to break even. This break-even point also didn’t consider the engineering effort to backfill the billions of existing feed items or the indirect cost of high complexity – more of our engineers’ time spent on support and maintenance.
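The shape of that break-even calculation is worth making concrete. Every number below is a placeholder, not one of Monzo's actual figures: the point is only that a one-off index build cost is recovered by the monthly saving on wasted reads, minus the index's own running cost.

```python
def break_even_months(index_build_cost, monthly_index_cost, monthly_read_saving):
    """Months until an index pays for itself, or None if it never does."""
    net_monthly = monthly_read_saving - monthly_index_cost
    if net_monthly <= 0:
        return None  # the index costs more to run than it saves
    return index_build_cost / net_monthly

months = break_even_months(
    index_build_cost=50_000,    # hypothetical backfill + setup cost
    monthly_index_cost=400,     # hypothetical ongoing writes and storage
    monthly_read_saving=1_100,  # hypothetical wasted reads avoided
)
# with these placeholder numbers, roughly six years before indexing wins
```

With any plausible inputs of this shape, a multi-year break-even makes the simpler fan-out-on-read design the better bet.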
Bringing clarity, not chaos
At this point, it may seem that this problem isn’t really that difficult: we just list all of the existing feeds and join them together, right? However, the unified feed was not only about joining all of your activity together, but also about simplifying it to make it digestible at a glance.
We introduced the idea of customisers within the unified feed: a function that receives the entire page of items and can decide to change them. No customiser is set by default, so a feed continues to render items in the standard way. However, if another feed calling the primary feed wants to make some tweaks, it can pass in a customiser function that wraps the default logic, changing the title or anything else needed. This meant we could reuse almost all of the existing logic in the feeds being unified while applying the little touches that make it unique.
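The customiser idea is essentially a higher-order function wrapping the default rendering. A minimal sketch, with invented names:

```python
def default_render(item):
    """The feed's standard rendering for one item."""
    return {"title": item["merchant"], "item": item}

def render_page(items, customiser=None):
    """Render a page; an optional customiser may rewrite the whole page."""
    page = [default_render(i) for i in items]
    if customiser is not None:
        page = customiser(page)  # the customiser sees the entire page
    return page

def shout_titles(page):
    """Example customiser: change titles, leave everything else alone."""
    return [{**row, "title": row["title"].upper()} for row in page]

items = [{"merchant": "Coffee Shop"}]
plain = render_page(items)
tweaked = render_page(items, customiser=shout_titles)
```

Because the customiser operates on the already-rendered page, the unified feed never needs to fork or duplicate the underlying feed's logic.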
Account badges
If you have a personal and joint account, it may not be immediately clear which account your transactions in the feed belong to. To avoid this confusion in the new feed, we added account badges to allow our customers to understand which accounts their activity belongs to. However, providing clarity isn’t about being explicit in every detail but only when it is valuable. For this reason, if you only have a personal account, you will not see any account badges, as they would only act as visual clutter.
While paging their feed, we look at the customer’s accounts to decide whether we want to include account badges. If we do, then we use our customiser function to wrap the underlying feed items with some code that can work out the account icon and then apply it to the item.
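Sketched out, the badge logic is a conditional customiser: only when the customer has more than one account do we wrap the page with code that attaches an account icon. Field names and icons here are hypothetical.

```python
# Hypothetical mapping from account to badge icon.
ACCOUNT_ICONS = {"acc_personal": "P", "acc_joint": "J"}

def badge_customiser(page):
    """Attach an account badge to every item on the page."""
    return [{**row, "badge": ACCOUNT_ICONS.get(row["account_id"])}
            for row in page]

def render(page, accounts):
    # Badges only add value when there is more than one account to tell apart.
    if len(accounts) > 1:
        page = badge_customiser(page)
    return page

with_badges = render([{"title": "Coffee", "account_id": "acc_joint"}],
                     accounts=["acc_personal", "acc_joint"])
without = render([{"title": "Coffee", "account_id": "acc_personal"}],
                 accounts=["acc_personal"])
```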
Aggregated upcoming payments
It’s common to have many bills paid around the same time each month, and showing these on the home screen is important so that our customers know what they will be charged and when. This can take up a lot of screen space for those unfortunate enough to have lots of bills! We simplified this within the unified feed by aggregating all upcoming incoming and outgoing items. This saves screen space and the need for ad-hoc Countdown number rounds when managing upcoming payments. If you want to see each payment, you can tap the aggregated item, and they will be listed individually within the full-screen feed view.
We define two aggregated items, one incoming and one outgoing, statically in the feed with a fixed date in the future. This means that if we ask for a page of items that crosses into the future then these new items will be included. The items will request the upcoming payments for each account and then decide how they should be displayed. If there are no upcoming payments then both items are hidden from the response to the frontend.
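A sketch of that trick, assuming a far-future timestamp so the static items sort ahead of everything else, and with invented field names:

```python
FAR_FUTURE = 10**12  # fixed future timestamp so these items sort first

def aggregated_items(upcoming):
    """Build the two static aggregate items, or none when there is nothing."""
    incoming = [p for p in upcoming if p["amount_minor"] > 0]
    outgoing = [p for p in upcoming if p["amount_minor"] < 0]
    items = []
    if outgoing:
        items.append({"created": FAR_FUTURE,
                      "title": f"{len(outgoing)} upcoming payments",
                      "amount_minor": sum(p["amount_minor"] for p in outgoing)})
    if incoming:
        items.append({"created": FAR_FUTURE,
                      "title": f"{len(incoming)} upcoming deposits",
                      "amount_minor": sum(p["amount_minor"] for p in incoming)})
    return items  # an empty list means nothing is shown to the frontend

items = aggregated_items([
    {"amount_minor": -1200}, {"amount_minor": -4500}, {"amount_minor": 30000},
])
```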
Deduplicating transfers
Two feed items are created when you transfer money from your account to a pot: one belongs to the account, describing the outgoing side, and the other belongs to the pot, showing the incoming side. If we naively merged these feeds, each transfer would appear twice. An obvious solution might be to search the current page of items and hide any duplicates, but this approach breaks down when the boundary of the current page falls between two duplicate items. Instead, we apply logic that understands the source and destination of the transfer from each feed item, and deterministically keep just one item per transfer, customised to represent the entire transfer. We've since extended this approach to handle over 40 different types of transfers!
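The key property is that the keep-or-drop decision depends only on the item itself, never on spotting its twin, so it works even when the twin lands on another page. A sketch under that assumption, with hypothetical field names:

```python
def dedupe_transfers(page):
    """Keep one side of each transfer, decided per item, not per pair."""
    kept = []
    for item in page:
        transfer = item.get("transfer")
        if transfer is None:
            kept.append(item)  # not a transfer: always keep
        elif item["account_id"] == transfer["source"]:
            # Keep only the outgoing side, rewritten to describe the whole
            # move; the incoming side is dropped wherever its page falls.
            kept.append({**item,
                         "title": f"Transfer to {transfer['destination']}"})
    return kept

page = [
    {"id": "a", "account_id": "acc_1",
     "transfer": {"source": "acc_1", "destination": "pot_savings"}},
    {"id": "b", "account_id": "pot_savings",
     "transfer": {"source": "acc_1", "destination": "pot_savings"}},
    {"id": "c", "account_id": "acc_1", "transfer": None},
]
deduped = dedupe_transfers(page)
```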
Unifying our customers’ financial lives
One of the newer improvements within the feed is the ability to include open banking data. Open banking is a regulation defining a range of APIs that allow banks to share information with each other. If you connect another bank account to Monzo, we can display the transaction information provided by that bank. So why can’t we just unify this too?

The greatest challenge we faced was maintaining the order of transactions between banks. If you bought a coffee in the morning with Monzo and some lunch with your other bank card, you would expect those items to appear in that order in the app. The problem was that some banks only process transactions at midnight, regardless of when during the day the payment was made. To help with this, the open banking squad added transaction time ranges, which estimate when we believe the payment was made rather than processed, by applying heuristics based on when we first saw the transaction and the timestamp provided.

We tested the change in detail, first with staff, then our community, and then within an experiment. We saw no harm to engagement and no feedback complaining about the ordering of items. Although this wasn’t the most technically challenging feature to build, it’s definitely one of the most important in fulfilling the vision of the feed.
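To make the idea concrete, here is one possible heuristic of that flavour. This rule is invented for illustration and is not the open banking squad's actual logic: it simply treats a suspiciously exact midnight timestamp as a processing time and falls back to when we first observed the transaction.

```python
SECONDS_PER_DAY = 86_400

def estimated_time(provided_ts, first_seen_ts):
    """Estimate when a payment was made, given the bank's timestamp and
    the moment we first saw the transaction (both Unix seconds, UTC)."""
    is_midnight = provided_ts % SECONDS_PER_DAY == 0
    if is_midnight and first_seen_ts < provided_ts:
        # Likely a batch-processing timestamp: we saw the transaction
        # earlier, so prefer that observation as the estimate.
        return first_seen_ts
    return provided_ts
```

Sorting unified-feed items on this estimate, rather than the raw bank timestamp, keeps the morning coffee above the lunchtime sandwich.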
Releasing safely to millions of customers
Rolling out a significant change to millions of customers with no downtime is no small feat. We bundled the release of the unified feed with the new-look home screen to avoid duplicating effort. Our biggest risk was causing platform instability by increasing requests to the new feed by 1000x, a particular concern as each request triggers an avalanche of downstream calls.
We could have tried to trace which services each request ends up calling, but this depends on which accounts and feed items are included in each page. We had to perform real-world testing to understand how the load would truly impact our platform. To do this, we used shadow traffic, which sounds as cool as it was useful! We updated the apps to call the new endpoints without displaying the response, then slowly rolled out the change, gathering production data while retaining the ability to reverse course without any customer impact. This cautious approach, stretching over months, allowed the platform to adjust to the new demands and gave us time to make any necessary optimisations. Once we nudged the feed out of the shadows, the result was... amazingly silent! We were very relieved to have finally shipped the new products.
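The shadow-traffic pattern can be sketched in a few lines. This is a simplified, hypothetical client: the customer sees only the old response, while the new endpoint quietly receives production-shaped load, and a shadow failure must never affect the customer.

```python
def load_home_screen(old_endpoint, new_endpoint, shadow_enabled):
    """Render from the old endpoint; optionally fire a discarded shadow call."""
    response = old_endpoint()      # what the customer actually sees
    if shadow_enabled:
        try:
            new_endpoint()         # exercise the new path, ignore the result
        except Exception:
            pass                   # shadow failures must never hurt customers
    return response

calls = []
old = lambda: calls.append("old") or "old feed"
new = lambda: calls.append("new") or "new feed"
screen = load_home_screen(old, new, shadow_enabled=True)
```

Gating `shadow_enabled` behind a staged rollout flag is what lets the backend load ramp up gradually and be switched off instantly.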
The unified feed is a great example of different functions within Monzo working together to create a delightful experience: design, product and user research folks evolving the idea; app engineers helping debug pagination issues; and backend engineers across collectives optimising the feeds. Although there were many challenges in delivering this product, it’s incredibly rewarding to know that it is helping millions of customers make money work for them.
Come work with us!
If this is something that piques your interest, check out our careers page or take a look at the roles below 👇