Getting to a Single Source of Truth

Structured, SOX compliant, multi-layer data lake

Practice Areas

  • Data Engineering
  • Data Lakes

Business Impacts

  • Dramatically reduced operational costs
  • Timely, accurate data for critical finance department functions

Challenges

  • Compliance
  • Auditing capability
  • Large number of source systems
  • Scalability

Technologies

  • HVR
  • Talend
  • Hadoop
  • Oracle

Summary

Large enterprises acquiring companies often face challenges integrating their financial systems. Over the last several decades, our client, a global enterprise, has acquired dozens of businesses, each with its own ERP system and data warehouse solution. By operating dozens of separate data warehouses, our client incurred redundant license, equipment and personnel costs.

To improve performance and reduce costs, our client launched an ambitious project to consolidate these disparate data warehouses into a single data lake for its financial data. This presented many technical, operational and security challenges.

Our client reached out to Starschema to design and implement a highly performant, SOX compliant and secure data lake solution.

Challenge

Every day, our client's companies record tens of thousands of financial transactions. This poses a unique set of challenges for a data lake:

  • Highly granular data, usually transaction-level, needs to be ingested in near real-time (latency <1 hour)
  • Over 100 source systems belonging to more than thirty different types
  • The solution needs to comply with Sarbanes-Oxley (SOX) requirements
  • Zero tolerance for data inconsistencies
  • 200TB of enterprise data comprising more than 25,000 data domains (tables) of over five hundred distinct types
  • Provide simultaneous data consumption and continuous ingestion

Solution

Starschema implemented a state-of-the-art architecture by deploying the Starschema Antares iDL™ design: raw data is first mirrored into a Massively Parallel Processing (MPP) relational database through ingestion with HVR and Talend. The data is then replicated to in-memory and Hadoop (file system) based consumption layers for later use, including aggregation, data stores and data science applications.

Data consumption then takes place over a dynamic lambda architecture that provides streaming and batch processing layers. To facilitate the operation of this multi-layer data lake, a standard data definition structure (standard model) was devised for identical domain types of raw data.
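To make the standard model idea concrete, the following is a minimal, purely illustrative Python sketch of how identical domain types from different source systems might be mapped onto a single shared definition. The domain, system, table and column names here are hypothetical and do not describe the actual Antares iDL™ implementation.

    # Hypothetical sketch: two source systems expose the same "invoice" domain
    # under different table and column names; a shared mapping renames their
    # rows onto one standard-model definition so downstream layers see one schema.
    STANDARD_MODEL = {
        "AP_INVOICES": ["invoice_id", "vendor_id", "amount", "currency", "posted_at"],
    }

    SOURCE_MAPPINGS = {
        ("erp_alpha", "T_INV_HDR"): {
            "invoice_id": "INV_NO", "vendor_id": "VEND_CD", "amount": "GROSS_AMT",
            "currency": "CURR", "posted_at": "GL_DATE",
        },
        ("erp_beta", "AP_INVOICE_HEADERS"): {
            "invoice_id": "INVOICE_ID", "vendor_id": "SUPPLIER_ID",
            "amount": "INVOICE_AMOUNT", "currency": "CURRENCY_CODE",
            "posted_at": "GL_POSTING_DATE",
        },
    }

    def to_standard(row, source, domain="AP_INVOICES"):
        """Rename one source row's columns to the standard model's column names."""
        mapping = SOURCE_MAPPINGS[source]
        return {std_col: row[src_col] for std_col, src_col in mapping.items()}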

In addition, the data lake automatically generates a continuously updated metadata knowledge base that stores discovered constraints and relationships within the data, providing insight into the data lake's underlying structure. This knowledge base in turn drives the Generic ETL Framework (GEF) and the Data Lake Audit Framework (DAF), which constantly maintain and audit the data layers. Operations, change management and development are supported by ITIL- and SOX-compliant DevOps CI/CD pipeline applications, which maintain compliance through a process-forcing design.
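As a similarly hedged illustration of how a metadata knowledge base can drive generic ETL and audit logic, the sketch below derives a load step and a row-count check from per-domain metadata. The class, function and column names, and the db handle with execute/scalar methods, are hypothetical stand-ins rather than the actual GEF and DAF interfaces.

    # Hypothetical sketch: the knowledge base records, per domain, where the raw
    # and consumption copies live; a generic loader and a generic audit check are
    # generated from that metadata instead of being hand-written for every table.
    from dataclasses import dataclass

    @dataclass
    class DomainMetadata:
        domain: str              # e.g. "AP_INVOICES"
        raw_table: str           # mirrored raw-layer table
        consumption_table: str   # consumption-layer target
        watermark_column: str    # column used to find not-yet-processed rows

    def generic_load(db, meta, last_watermark):
        """One generated ETL task: copy new raw rows into the consumption layer."""
        # Parameter binding style depends on the database client library in use.
        db.execute(
            f"INSERT INTO {meta.consumption_table} "
            f"SELECT * FROM {meta.raw_table} WHERE {meta.watermark_column} > :wm",
            {"wm": last_watermark},
        )

    def audit_row_counts(db, meta):
        """Audit-style check: raw and consumption layers must agree on row counts."""
        raw = db.scalar(f"SELECT COUNT(*) FROM {meta.raw_table}")
        cons = db.scalar(f"SELECT COUNT(*) FROM {meta.consumption_table}")
        return raw == cons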

Results

Every day, the SOX-certified Finance Data Lake ingests approximately 200TB of data through 6,000 parallel ETL processes from a diverse range of source systems – Oracle, SAP, PeopleSoft, Hyperion, enterprise-developed systems and others – into a single Oracle EBS-type Standard Model.

Data is ingested in near real-time, allowing the enterprise to perform crucial finance functions, such as closing and reporting, account reconciliation and centralized tax calculation, based on accurate, consistent and up-to-date data.

Enterprise-Scale AWS Cloud Migration

Our client, a Fortune 50 energy company, needed to migrate all of their existing data and analytics platforms to Amazon Web Services and redesign the existing architecture to leverage AWS-native technologies. The initiative aimed to optimize performance and reduce long-term operating costs by taking advantage of recent advancements in AWS cloud-native solutions.

Oracle to AWS Data Platform Migration

When large organizations merge or divest, the new entity has business transformation thrust upon it. Our client, a provider of power generation solutions, divested from a larger organization to become a brand-new company. Facing a looming deadline to cut over from the existing data platform, the client engaged Starschema to validate and test the existing technology stack and determine the best way to move forward.

Starschema Antares iDL™

A fully automated, compliant-by-design intelligence data lake architecture with real-time ingestion and best-of-breed standardization and audit features.

Large-Scale Data Replication Deployment

Our client, a global manufacturer in the power generation industry, faced challenges with an aging reporting environment based on multiple Oracle ODS systems and reporting straight from source databases.