
Case Study: Transforming Treasury Reporting with Robotic Process Automation (RPA)

Background

A global banking group’s capital management function faced significant operational challenges due to a rapid expansion in its responsibilities, low levels of governance and an IT infrastructure that was not fit for purpose.

The function oversaw the booking, management and reporting of bonds issued by the bank as well as reporting its capital position to the regulatory authorities and senior management.

The function’s responsibilities had grown rapidly over a three-year period, but there was limited investment in systems and processes, so staff created a plethora of End User Computing (EUC) solutions to aid their work. This gave rise to a number of issues, detailed below.

 

  • Fragmented Data Sources: The treasury team grappled with data scattered across various systems, hindering efficient decision-making. Reports were not directly generated from core systems; instead, data had to be extracted from multiple sources and manually combined.

  • Excel-Driven Complexity: Over the years, the treasury function had developed numerous complex EUC solutions. These spreadsheets, some containing macros, were critical for financial analysis, risk assessment, and liquidity management. However, their intricate nature made them difficult to maintain and understand.

  • Lack of Documentation: The EUCs lacked proper documentation, leaving users uncertain about their operation. Auditing each EUC was time-consuming, and controls were inadequately evidenced.

  • Staff Morale and Efficiency: Highly educated treasury professionals spent long hours performing repetitive tasks using Excel. The resulting lack of job satisfaction and morale affected productivity.

  • Reporting Challenges: While Excel is flexible for ad hoc analyses, it falls short for regular, repeatable reporting, and the treasury team struggled to provide accurate and timely reports.

  • Audit Concerns: Internal audits revealed errors that users found challenging to explain due to the undocumented intricacies of specific EUCs.

  • System Upgrades: Prospective upgrades to formal systems were complex, expensive, and time-consuming. Even minor updates took months to implement.

 

Management acknowledged that no funding was available for the IT systems needed to address the above issues, and that any formal IT solution would in any event take at least two years to implement due to the complexity of the function’s processes. As a result, it was decided that a pilot project would be undertaken to determine whether RPA could alleviate the issues.

The scope of the pilot project covered the daily Profit and Loss (P&L) report for the global banking group’s issued debt portfolio. This report tracked movements in the value of existing bonds and swaps, cash transactions, and new deals.


Existing Situation

The report relied on an EUC spreadsheet that was operated manually, based on user knowledge, and involved several steps to complete. The reporting process was performed by team members located in India and the UK. The output was distributed via email to a wide range of stakeholders, including the Group CFO.

The report used the following data sources (an illustrative consolidation sketch in Python follows the list):

  • Transaction Data: Derived from corporate systems, with separate data sets for foreign exchange (FX) deals and bonds.

  • FX Rates: Obtained via email (Excel table).

  • Bank Statements: Extracted from PDF documents.

  • Static Data: Account codes maintained in an Excel spreadsheet, updated manually.

  • New Deals: These needed to be manually identified and incorporated into the spreadsheet.
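
As a rough illustration of how these sources might be pulled together, the sketch below (written in Python with pandas, not the bank’s actual tooling) reads hypothetical FX and bond transaction extracts, attaches the emailed FX-rate table and the static account codes, and flags new deals that have no account code yet. The file names, sheet layouts and column names are assumptions made for the example, and the PDF bank statements would need an additional text-extraction step that is omitted here.

import pandas as pd

def load_transactions(fx_path: str, bond_path: str) -> pd.DataFrame:
    # Combine the separate FX-deal and bond extracts into one transaction table.
    fx_deals = pd.read_csv(fx_path)
    bonds = pd.read_csv(bond_path)
    return pd.concat([fx_deals, bonds], ignore_index=True)

def enrich(transactions: pd.DataFrame, fx_rates_path: str, static_path: str) -> pd.DataFrame:
    # Attach the emailed FX-rate table and the static account-code mapping.
    fx_rates = pd.read_excel(fx_rates_path)       # assumed columns: currency, rate
    account_codes = pd.read_excel(static_path)    # assumed columns: instrument_id, account_code
    enriched = transactions.merge(fx_rates, on="currency", how="left")
    return enriched.merge(account_codes, on="instrument_id", how="left")

if __name__ == "__main__":
    deals = load_transactions("fx_deals.csv", "bond_deals.csv")
    report_data = enrich(deals, "fx_rates.xlsx", "account_codes.xlsx")
    # New deals without an account code surface as blanks and need manual follow-up.
    print(report_data[report_data["account_code"].isna()])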


A variety of concerns were identified with the existing reporting process:

  • Time-Consuming: It took 1-2 hours per day to compile the report.

  • Extensive Data Manipulation: A number of manual operations were required to format data into a standardised table.

  • Poor Documentation: The process had limited documentation and the operation of controls was not recorded.

  • Inadequate Analysis: Due to time pressures, the team did not always have time to investigate what was driving the P&L movements, leading to management frustration.

  • Errors: Occasional errors were occurring, especially when inexperienced team members were tasked with running the report.

Addressing these challenges would be crucial to enhancing the efficiency, accuracy, and transparency of the reporting process.


Approach

A thorough analysis was required to guide the automation strategy and help design an efficient, error-free process. The following steps were taken to analyse the reporting process:

 

  1. Review of Process Documentation:

  • A thorough examination of existing process documentation to identify gaps, inconsistencies, and areas needing improvement.

  • Updates to ensure the documentation accurately reflected the current process steps and components.

  2. Interviews with Personnel Performing the Process:

  • Engagement with the individuals directly involved in the process to understand their day-to-day activities, pain points, and challenges.

  • Gathering of insights on any manual workarounds, bottlenecks, or inefficiencies encountered.

  3. Interviews with Management:

  • Conversations with management stakeholders to grasp their expectations of, and requirements for, the process.

  • An understanding of their concerns related to quality, timeliness, compliance, and overall effectiveness.

  • Documentation of their feedback and priorities.

  4. Breakdown of the Process:

  • Decomposition of the entire process into granular steps to identify subprocesses and dependencies.

  • Creation of a process flow to visualise the sequence of activities.

  5. Identification of Periodic Aspects:

  • Determination of the frequency of the process elements (e.g. daily, monthly, annually).

  • Identification of any specific steps that occurred on a regular basis (e.g. month-end and year-end activities).

  6. Documentation of Process Components:

  • Data Sources and Formats:

  • Compilation of a list of all data sources (transaction data, FX rates, bank statements etc.), specifying the format of each data source (Excel, PDF, database, etc.).

  • Systems Used:

  • Identification of the software systems or tools involved (corporate systems, email clients, spreadsheets etc.), noting any integrations or interfaces between systems.

  • Manual Actions Required:

  • Documentation of all manual elements (data entry, reconciliation, approvals), highlighting areas where human intervention was necessary.

  • Controls and Approvals:

  • Identification of control points (validation checks, sign-offs, reconciliations).

  • Understanding the approval hierarchy and any compliance requirements.

  7. Assessment of Ease of Automation (a simple scoring sketch follows this list):

  • Data Availability:

  • Evaluation of whether all required data was consistently available.

  • Consideration of data gaps and/or delays.

  • Data Retrievability:

  • Assessment of the ease of extracting data from various sources.

  • Investigation of any limitations (e.g. APIs, data connectors).

  • Data Quality:

  • Analysis of data accuracy, completeness, and consistency, addressing any data quality issues.

  • Systems Access:

  • Checks on whether the necessary systems had APIs or automation capabilities.

  • Confirmation that proper permissions for data retrieval and updates were in place.

  • Required Manual Interventions:

  • Identification of steps that could not be fully automated (e.g. manual approvals).

  • Articulation of how to handle such manual interventions effectively.
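
One way to make this assessment comparable across process steps is a simple scoring rubric. The sketch below (Python) is purely illustrative: the criteria mirror the bullets above, but the 1-5 scale, the unweighted average and the example scores are assumptions made for this write-up, not the methodology the bank used.

from dataclasses import dataclass, astuple

@dataclass
class AutomationAssessment:
    step: str
    data_available: int     # 1 = gaps/delays, 5 = always available
    data_retrievable: int   # 1 = manual re-keying, 5 = API or file drop
    data_quality: int       # 1 = frequent issues, 5 = clean and consistent
    systems_access: int     # 1 = no automation access, 5 = permissions in place
    manual_steps: int       # 1 = many approvals needed, 5 = fully hands-off possible

    def score(self) -> float:
        # Unweighted average of the five criteria; higher means easier to automate.
        criteria = astuple(self)[1:]
        return sum(criteria) / len(criteria)

steps = [
    AutomationAssessment("Combine trade extracts", 5, 5, 4, 4, 5),
    AutomationAssessment("Read bank statement PDFs", 4, 2, 3, 3, 4),
]
for step in sorted(steps, key=AutomationAssessment.score, reverse=True):
    print(f"{step.step}: {step.score():.1f}")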

 

Solution

The pilot project used a vendor tool that was already in use elsewhere within the bank. The approach was to train users as “citizen developers” who could create automation configurations using the tool. Training took around three days to complete and equipped users with the basics needed to start creating their own configurations.


This tool enabled an RPA bot to be created that could (see the conceptual ingestion sketch after this list):

  • Request, extract, read and output data in multiple formats (Excel, .csv, PDF, text etc.) from disparate systems, normalise it, and consolidate it into a central repository.

  • Perform specified calculations, and sort, format and summarise data in a similar way to Excel.

  • Create detailed documentation for each RPA process. Audit trails and controls were embedded within the bots, providing transparency and compliance.

  • Absorb static data updates without users needing access to the vendor tool.

  • Allow users trained as citizen developers to make changes to the RPA bot configuration, test these changes and release them into production.
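
As a conceptual illustration of the first capability (written in Python rather than the vendor tool, whose configuration language is not shown here), the sketch below reads files of different formats into a common tabular form, normalises column names, consolidates them into a single table and keeps a rudimentary audit trail. The dispatch-by-extension logic and the audit-log fields are assumptions made for the example.

from datetime import datetime, timezone
import pathlib
import pandas as pd

audit_log = []  # records what was read, when, and how many rows came back

def ingest(path: str) -> pd.DataFrame:
    # Pick a reader based on the file extension and log the action for the audit trail.
    suffix = pathlib.Path(path).suffix.lower()
    if suffix == ".csv":
        frame = pd.read_csv(path)
    elif suffix in (".xlsx", ".xls"):
        frame = pd.read_excel(path)
    elif suffix == ".txt":
        frame = pd.read_csv(path, sep="\t")
    else:
        raise ValueError(f"unsupported format: {suffix}")  # PDFs would need a text-extraction step
    audit_log.append({"source": path, "rows": len(frame),
                      "read_at": datetime.now(timezone.utc).isoformat()})
    return frame

def consolidate(paths: list[str]) -> pd.DataFrame:
    # Normalise column names and stack every source into one repository table.
    frames = []
    for path in paths:
        frame = ingest(path)
        frame.columns = [str(col).strip().lower().replace(" ", "_") for col in frame.columns]
        frames.append(frame)
    return pd.concat(frames, ignore_index=True)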


A number of challenges were faced: convincing management to free up staff time so they could undergo the training; gaining access for the RPA bot to systems where the standard access procedure required an HR number; and maintaining data security for restricted data sets.

Whilst the automation tool was not the most intuitive to use, it was very powerful, and the resulting RPA bot was configured to do the following (an illustrative sketch of the core P&L step follows below):

  • Read emails sent from the trading system to the RPA bot’s mailbox, extract the attached .csv files containing the previous day’s trade and position data, and combine them into one table.

  • Read a table holding the previous day’s FX rates and incorporate these into the data table.

  • Look up the correct accounting codes and add these to the position records in the table.

  • Calculate the day’s P&L from the balance sheet and cash movements compared to the previous day’s data.

  • Produce exception reports for any new deals where accounting codes needed to be added or any other data was missing.

  • Output an Excel spreadsheet holding the detailed P&L, together with a summary table, to a specified directory so that the P&L could be reviewed by a staff member.

  • Prepare an email to specified recipients with the day’s summary P&L attached, for the supervisor’s approval.

  • Absorb any amendments made by the reporting team and output the revised results.

In addition, any changes to static data such as account codes and transaction groupings could be made simply by updating the relevant Excel table and dropping it into a specified directory. Because the RPA output and any static data changes were handled via email and Windows Explorer, no user needed access to the RPA tool or its configuration.
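
As a rough end-to-end illustration of the core P&L step (again in Python with pandas, standing in for the vendor tool’s configuration), the sketch below compares today’s position values and cash with the previous day’s, writes a detailed P&L workbook for review, produces an exception report for deals that are missing an account code, and prints a summary that could accompany the approval email. The column names, file names and the grouping used for the summary are assumptions made for the example.

import pandas as pd

def daily_pnl(today: pd.DataFrame, previous: pd.DataFrame) -> pd.DataFrame:
    # Assumed columns: position_id, account_code, value, cash (today);
    # position_id, value, cash (previous day).
    merged = today.merge(previous, on="position_id", how="outer", suffixes=("_today", "_prev"))
    value_cols = ["value_today", "value_prev", "cash_today", "cash_prev"]
    merged[value_cols] = merged[value_cols].fillna(0.0)
    # Day-on-day movement in balance-sheet value plus the cash movement.
    merged["pnl"] = (merged["value_today"] - merged["value_prev"]) \
                  + (merged["cash_today"] - merged["cash_prev"])
    return merged

def exception_report(pnl: pd.DataFrame) -> pd.DataFrame:
    # New deals without an account code (or other missing reference data) need manual follow-up.
    return pnl[pnl["account_code"].isna()]

if __name__ == "__main__":
    today = pd.read_csv("positions_today.csv")
    previous = pd.read_csv("positions_previous.csv")
    result = daily_pnl(today, previous)
    result.to_excel("daily_pnl_detail.xlsx", index=False)             # detailed P&L for review
    exception_report(result).to_csv("pnl_exceptions.csv", index=False)
    print(result.groupby("account_code")["pnl"].sum())                # summary table for the email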

 

Results

As a result of the success of the pilot project, further work was undertaken with the following results.

 

  • EUC Automation: Complex EUCs were gradually replaced by RPA bots. These bots replicated the existing processes, ensuring accuracy and consistency. Macros were translated into RPA workflows.

  • Task Offloading: RPA took over mundane tasks, freeing up treasury professionals to focus on strategic analysis, risk modelling, and decision-making.

  • Automated Reporting: Regular reporting became seamless. RPA bots generated standardised reports, reducing turnaround time and enhancing data accuracy.

  • Agile Upgrades: RPA allowed for agile adjustments. Even minor system updates were implemented swiftly, avoiding lengthy lead times.

  • Efficiency: Manual efforts reduced significantly, leading to improved staff morale and job satisfaction.

The work resulted in a number of benefits.

  • Accuracy: Errors decreased, and audit trails provided evidence of controls.

  • Timeliness: Reports were delivered promptly, aiding decision-making.

  • Cost Savings: RPA eliminated the need for expensive system upgrades.

  • Scalability: As the treasury function evolved, RPA easily adapted to new processes.

  • Work Satisfaction: Staff were able to spend more time analysing the data than producing and checking it.


In summary, RPA transformed the capital management function’s operations, turning mundane tasks into automated workflows and empowering treasury professionals to focus on value-adding activities.

 

 
 
 
