Risk management data visualization for a capital investment fund

SPsoft optimized the performance and data visualization of an existing risk management and trading platform for a Singaporean capital investment fund.

Client

The customer is a capital investment fund based in Singapore that trades in listed stocks, futures, options, commodities, and currencies. They focus on minimizing trading risk to maximize the profitability of their operations.

  • Location: Singapore
  • Industry: Capital Investment
  • Project period: 2019-2020
  • Team size: 5 people
  • Services: Data analysis, Cloud architecture design and DevOps services, Python scripting, Quality Assurance
  • Techstack: Python/Bonobo, PostgreSQL, Tableau

Delivered value

SPsoft provided a software and data engineering team that helped the customer’s in-house team redesign the architecture of their existing Excel-based pricing engine. This brought ETL operations close to real time, improved platform performance, and provided better data visualization. As a result, the customer could operate the platform more cost-effectively and at a much larger scale.

Challenge

The main goal of the project was to redesign the risk management platform’s architecture to enable pricing engine operations in under 5 seconds and update the API integration. SPsoft had to implement data visualization with Tableau instead of an Excel-based custom module, migrate the historical data to new database schemas — and do it all in under 8 weeks.

What’s your challenge? Let us deliver the talent and expertise to help you solve it.

The process

The project was split into two parts: first building a data management system, then integrating Tableau to implement data visualization from scratch. The first step was a discovery phase, in which the SPsoft team analyzed the existing data schema and the .xls macros used for data management. This resulted in the design of a data lake supporting a fast, performant ETL process.

After two or three working sessions with the client, the best database structure was defined, consisting of Landing, Calculation, and Data Marts layers. Each data chunk was saved as a separate PostgreSQL schema to keep the data isolated and keep junk data out of the Data Marts.
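The layered layout described above maps naturally onto separate PostgreSQL schemas. The sketch below is a hedged illustration of that idea, not the fund's actual DDL: the schema names `landing`, `calculation`, and `marts` are hypothetical stand-ins for the Landing, Calculation, and Data Marts layers.

```python
# Hypothetical sketch of the three-layer PostgreSQL layout described above:
# "landing" holds raw ingested data, "calculation" holds intermediate
# results, and "marts" holds cleaned, report-ready tables for Tableau.
LAYERS = ("landing", "calculation", "marts")

def schema_ddl(layers=LAYERS):
    """Return one CREATE SCHEMA statement per isolation layer."""
    return [f"CREATE SCHEMA IF NOT EXISTS {name};" for name in layers]

if __name__ == "__main__":
    for stmt in schema_ddl():
        print(stmt)
```

Keeping each layer in its own schema means access rights and cleanup jobs can be scoped per layer, so nothing reaches the Data Marts without passing through the calculation stage.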

The next step involved developing and implementing Python scripts for the ETL solution and pricing engine optimization. To improve data visualization capabilities, Tableau was integrated at the Data Marts level to enable ad-hoc reporting and analytics.
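Bonobo, the ETL library in the project's tech stack, builds pipelines by chaining plain Python callables into a graph. The following is a minimal, self-contained sketch of that extract-transform-load shape using ordinary Python generators rather than the bonobo library itself; the row fields and values are purely illustrative.

```python
def extract():
    # Stand-in for pulling raw rows from the landing schema.
    yield {"ticker": "ABC", "price": "101.5"}
    yield {"ticker": "XYZ", "price": "87.25"}

def transform(row):
    # Stand-in for the calculation layer: cast and clean fields.
    return {"ticker": row["ticker"], "price": float(row["price"])}

def load(rows):
    # Stand-in for inserting cleaned rows into a data-mart table.
    return list(rows)

def run_pipeline():
    # Chain the stages lazily, as a Bonobo graph would.
    return load(transform(row) for row in extract())

if __name__ == "__main__":
    print(run_pipeline())
```

Because each stage is a small, independent callable, stages can be profiled and optimized in isolation, which is what makes the sub-5-second pricing target tractable.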

The project took under 7 weeks to complete.