
Data Aggregation Hub for Treasury

How we developed a data aggregation hub for treasury that allowed the bank to reduce data errors, eliminate duplicates, and get updates from the Golden Source dictionary faster.

  • 1 day

    Value updates in the Golden Source dictionary instead of 1 month

  • 100%

    Increase in manual processing time

  • 100%

    Duplication elimination

  • 99%

    Data errors reduction

About the Project

The data aggregation hub for treasury is a high-load, secure, web-based system that ensures seamless data access, validation, verification, processing, storage, and manual amendment.


Business Challenge

With a growing sensitivity to data collection and protection, banks face a pressing need for powerful and secure data aggregation systems. Our customer required a system that would arrange data feeds from the existing databases and match them against the Golden Source dictionaries. The matched values then had to be aggregated and stored for further reference and reporting.

Technical Challenges

Banks are enterprises with high-load infrastructure, and some of the customer’s data feeds contained over 1M records, so we had to ensure the highest level of performance.
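
Feeds of this size are typically processed in bounded batches so that memory use stays flat. The snippet below is only a minimal sketch of such chunked ingestion in Python, assuming a CSV-style feed; the file name, columns, and batch size are illustrative and not details from the project.

```python
import csv
from itertools import islice
from typing import Dict, Iterator, List

BATCH_SIZE = 50_000  # hypothetical tuning value, not the project's setting

def read_in_batches(path: str, batch_size: int = BATCH_SIZE) -> Iterator[List[Dict[str, str]]]:
    """Yield a large feed as fixed-size lists of rows instead of loading it whole."""
    with open(path, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            yield batch

if __name__ == "__main__":
    total = 0
    for batch in read_in_batches("treasury_feed.csv"):  # placeholder file name
        total += len(batch)  # a real pipeline would validate and aggregate here
    print(f"processed {total} records")
```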


Solution Delivered

We developed and implemented ETL processes to transform the customer data and prepare data marts. The final solution provides a new interface for collecting historical and real-time customer data from different sources, aggregating it, and transferring it to the bank’s front system. To meet the client’s requirements for seamless data aggregation and high performance, we built a high-load, secure, web-based system with a user-friendly interface. The solution allows amending data manually when values don’t match the Golden Source, for example, new books or counterparties.
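
As a rough illustration of this match-or-amend step, the sketch below looks each incoming value up in a Golden Source dictionary and sets aside anything that doesn’t match (for example, a new book) for manual amendment. All names and fields here are invented for illustration; they are not the bank’s actual schema or API.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MatchResult:
    matched: List[dict] = field(default_factory=list)               # enriched with golden values
    for_manual_amendment: List[dict] = field(default_factory=list)  # routed to the amendment UI

def match_against_golden_source(records: List[dict], golden_books: Dict[str, str]) -> MatchResult:
    """Split records into those that match the Golden Source and those needing manual review."""
    result = MatchResult()
    for rec in records:
        key = str(rec.get("book_code", "")).strip().upper()
        if key in golden_books:
            rec["book_name"] = golden_books[key]  # take the value from the Golden Source
            result.matched.append(rec)
        else:
            result.for_manual_amendment.append(rec)
    return result

# Toy usage: one record matches, one (a new book) is flagged for manual amendment
golden_books = {"TRSY01": "Treasury Book 1"}
feed = [{"book_code": "trsy01"}, {"book_code": "NEWBOOK7"}]
result = match_against_golden_source(feed, golden_books)
print(len(result.matched), len(result.for_manual_amendment))  # 1 1
```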


Key Features

  • Data access

    Users can store and amend the stored data. We implemented clear authorization procedures with different types of data access rights.

  • Data validation

    The system ensures that incoming data is cleansed according to the validation rules, which are deployed as a set of stored procedures.

  • Data verification

    The system includes data check procedures that verify the accuracy and consistency of input data. Proofreading and double-entry checks are mainly used to verify data; a simplified sketch of the validation and verification checks appears after this list.

  • Data processing

    The system ensures end-to-end data processing (DP), including manual data amendments. It involves data validation, aggregation, and reporting, along with other fundamental functionalities.

  • Data storage

    The customer can now be sure that important information is kept safe. A user-friendly interface provides an intuitive experience for data storage and processing.

  • Manual data amendment

    Users can amend data manually and receive system notifications about all modifications made. The ability to manually check and edit data reduces the number of errors.
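
The validation and verification features above are implemented as stored procedures in the delivered system; the sketch below only illustrates the kinds of checks involved, expressed as plain Python predicates with invented field names.

```python
from decimal import Decimal, InvalidOperation
from typing import Tuple

def validate(record: dict) -> list:
    """Validation: return a list of rule violations; an empty list means the record is clean."""
    errors = []
    if not record.get("counterparty"):
        errors.append("missing counterparty")
    try:
        Decimal(str(record.get("amount", "")))
    except InvalidOperation:
        errors.append("amount is not numeric")
    return errors

def verify_double_entry(primary: dict, re_entered: dict, fields: Tuple[str, ...]) -> bool:
    """Verification: a double-entry check that the same record keyed in twice agrees field by field."""
    return all(primary.get(f) == re_entered.get(f) for f in fields)

# Toy usage
rec = {"counterparty": "ACME", "amount": "100.00"}
print(validate(rec))                                                    # []
print(verify_double_entry(rec, dict(rec), ("counterparty", "amount")))  # True
```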

Value Delivered

  • 99%

    Data errors reduction

    The sophisticated data validation and verification processes implemented in the system reduced the number of data errors by 99%.

  • 100%

    Total duplication elimination

    Duplicate records are now effectively detected and removed by the system, which eliminates errors and data silos.

  • 100%

    Increase in manual processing time

    When data values don’t match the allocated data set, users can amend them manually.

New values were updated in the Golden Source dictionary within 1 day, instead of 1 month

Client Information

A bank from one of the Gulf region countries; the name is under NDA.

