Data Aggregation Hub for Treasury

  • 99% reduction in data errors

  • 100% duplication elimination

  • Value updates in the Golden Source dictionary within 1 day

  • Manual amendment for 100% of mismatched values


Data aggregation hub for treasury is a high-load, secure, web-based system that ensures seamless data access, validation, verification, processing, storage, and manual amendment.


Under NDA


With growing sensitivity around data collection and protection, banks face a pressing need for powerful, secure data aggregation systems. Our customer required a system that would arrange data feeds from the existing databases and match them against the Golden Source dictionaries. The matched values then had to be aggregated and stored for further reference and reporting.


Banks run high-load infrastructure, and some of the customer’s data feeds contained over 1M records, so we had to ensure the highest level of performance.


To meet the client’s requirements for seamless data aggregation and high performance, we built a high-load, secure, web-based system with a user-friendly interface. The solution allows amending data manually when values don’t match the Golden Source dictionaries, for example, new books or counterparties.
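As a rough illustration of the matching step described above, the sketch below checks feed values against Golden Source dictionaries and flags mismatches (such as new books or counterparties) for manual amendment. All names here (`FeedRecord`, `matchAgainstGoldenSource`, the dictionary fields) are illustrative assumptions, not the actual system’s API.

```typescript
// Hypothetical sketch: match feed records against Golden Source dictionaries
// and flag fields whose values are absent, so users can amend them manually.

interface FeedRecord {
  book: string;
  counterparty: string;
}

interface MatchResult {
  record: FeedRecord;
  mismatches: string[]; // fields whose values are not in the dictionaries
}

function matchAgainstGoldenSource(
  records: FeedRecord[],
  goldenSource: { books: Set<string>; counterparties: Set<string> },
): MatchResult[] {
  return records.map((record) => {
    const mismatches: string[] = [];
    if (!goldenSource.books.has(record.book)) mismatches.push("book");
    if (!goldenSource.counterparties.has(record.counterparty)) {
      mismatches.push("counterparty");
    }
    return { record, mismatches };
  });
}
```

Records with an empty `mismatches` list can flow straight to aggregation and storage; the rest are routed to the manual amendment workflow.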



Key features

  • Data Access

    Users can store and amend the stored data. We implemented clear authorization procedures that grant different types of data access rights.

  • Data Validation

    The system cleanses incoming data of the required types and checks it against validation rules deployed as a set of stored procedures.

  • Data Verification

    The system runs data checks to verify the accuracy and consistency of input data, relying mainly on proofreading and double-entry checks.

  • Data Processing

    The system ensures end-to-end data processing, including manual data amendments. It covers data validation, aggregation, and reporting, along with other fundamental functionality.

  • Data Storage

    The customer can now be sure that important information is kept safe. A user-friendly interface gives users an intuitive experience for storing and processing data.

  • Manual Data Amendment

    Users can retrieve data manually and receive system notifications of all modifications made. The ability to check and edit data manually reduces the number of errors.
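The validation and verification features above can be sketched in miniature. Per the description, validation rules ran as stored procedures in the delivered system; the TypeScript below only models the idea, with two illustrative rules and a double-entry comparison that are assumptions, not the actual rule set.

```typescript
// Hypothetical sketch of validation rules and a double-entry check.
// Each rule returns an error message, or null when the value passes.

type Rule = (value: string) => string | null;

const rules: Rule[] = [
  (v) => (v.trim().length > 0 ? null : "value must not be empty"),
  (v) => (v.length <= 50 ? null : "value exceeds 50 characters"),
];

// Collect every rule violation for a single input value.
function validate(value: string): string[] {
  return rules
    .map((rule) => rule(value))
    .filter((e): e is string => e !== null);
}

// Double-entry check: a value is accepted only when two
// independently entered copies agree (ignoring surrounding whitespace).
function doubleEntryMatches(first: string, second: string): boolean {
  return first.trim() === second.trim();
}
```

Keeping each rule as a standalone unit mirrors the stored-procedure approach: rules can be added or amended without touching the rest of the pipeline.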


Results

  • Reduction of data errors by 99%

    Sophisticated data validation and verification processes realized in the system reduced the number of data errors by 99%.

  • Total duplication elimination

    Duplicate records are now detected and removed by the system, eliminating errors and data silos.

  • Manual amendment for 100% of mismatched values

    When data values don’t match an allocated data set, users can amend them manually.

  • 1-day updates

    New values are now updated in the Golden Source dictionary within 1 day instead of 1 month.
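The duplication-elimination result above amounts to collapsing records that share a business key. A minimal sketch, assuming a single illustrative `key` field (the real system’s keys and record shape are not stated in the source):

```typescript
// Hypothetical sketch of duplicate elimination: records sharing the same
// business key are collapsed, keeping the first occurrence.

interface TradeRecord {
  key: string;    // illustrative business key
  amount: number;
}

function deduplicate(records: TradeRecord[]): TradeRecord[] {
  const seen = new Set<string>();
  const unique: TradeRecord[] = [];
  for (const record of records) {
    if (!seen.has(record.key)) {
      seen.add(record.key);
      unique.push(record);
    }
  }
  return unique;
}
```

A set-based pass like this stays linear in the number of records, which matters for feeds of the 1M-record scale mentioned earlier.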

Our Tech Stack

  • Angular
  • Oracle
  • NodeJS
  • PL/SQL
