Key Skills and Expertise:
Client: FlyFrontier
Project: MySQL-to-Snowflake Migration
Description:
At FlyFrontier, we're undertaking a crucial initiative to migrate our MySQL databases to Snowflake using Azure Data Factory (ADF) pipelines. This migration aims to enhance our data warehousing capabilities, enabling us to leverage Snowflake's scalable and flexible architecture for improved data analytics and insights.
Responsibilities:
Database Migration Planning: Collaborate with stakeholders to understand the MySQL database structure, data dependencies, and migration requirements. Develop a comprehensive migration plan outlining the steps, timelines, and resources needed for a smooth transition to Snowflake.
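As an illustration of this planning step, a short Python sketch like the one below can inventory table sizes and foreign-key dependencies from MySQL's information_schema to feed the migration plan. The host, credentials, and the flyfrontier_db schema name are hypothetical.

    # Inventory MySQL tables and dependencies to size and sequence the migration.
    import mysql.connector

    conn = mysql.connector.connect(host="mysql-host", user="etl_user",
                                   password="***", database="flyfrontier_db")
    cur = conn.cursor()

    # Approximate row counts per table, for sizing and sequencing.
    cur.execute("""
        SELECT table_name, table_rows
        FROM information_schema.tables
        WHERE table_schema = 'flyfrontier_db'
    """)
    for table_name, table_rows in cur.fetchall():
        print(f"{table_name}: ~{table_rows} rows")

    # Foreign-key dependencies, so parent tables migrate before children.
    cur.execute("""
        SELECT table_name, referenced_table_name
        FROM information_schema.key_column_usage
        WHERE table_schema = 'flyfrontier_db'
          AND referenced_table_name IS NOT NULL
    """)
    for child, parent in cur.fetchall():
        print(f"{child} depends on {parent}")

    conn.close()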
ADF Pipeline Development: Design, develop, and deploy Azure Data Factory pipelines to automate the extraction, transformation, and loading (ETL) processes from MySQL databases to Snowflake. Configure data ingestion tasks, data cleansing operations, and data validation routines within ADF pipelines.
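The copy activity itself is authored as ADF pipeline JSON; to make the ETL logic concrete, the Python sketch below shows the equivalent extract-cleanse-load flow. The bookings table, Snowflake account, and credentials are hypothetical, and auto_create_table assumes a recent snowflake-connector-python version.

    # Minimal sketch of the extract/cleanse/load flow the ADF pipeline automates.
    import pandas as pd
    import mysql.connector
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    # Extract from MySQL.
    src = mysql.connector.connect(host="mysql-host", user="etl_user",
                                  password="***", database="flyfrontier_db")
    df = pd.read_sql("SELECT * FROM bookings", src)

    # Cleanse: drop exact duplicates and rows missing the primary key.
    df = df.drop_duplicates().dropna(subset=["booking_id"])

    # Load into Snowflake.
    snow = snowflake.connector.connect(account="flyfrontier", user="etl_user",
                                       password="***", warehouse="ETL_WH",
                                       database="ANALYTICS", schema="STAGING")
    write_pandas(snow, df, table_name="BOOKINGS", auto_create_table=True)
    snow.close()
    src.close()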
Snowflake Administration: Administer the Snowflake environment, including user management, role assignments, and security configurations. Monitor system performance, troubleshoot issues, and optimize Snowflake configurations for efficient data processing and storage.
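The sketch below shows routine administration of this kind run over the Snowflake Python connector, following the role-based access pattern (privileges granted to roles, roles granted to users). The role, user, warehouse, and schema names are hypothetical.

    # Routine Snowflake admin tasks via the Python connector.
    import snowflake.connector

    conn = snowflake.connector.connect(account="flyfrontier", user="admin",
                                       password="***", role="SECURITYADMIN")
    cur = conn.cursor()

    # Role-based access: one role per function, privileges granted to the role.
    cur.execute("CREATE ROLE IF NOT EXISTS ANALYST")
    cur.execute("GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST")
    cur.execute("GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.REPORTING TO ROLE ANALYST")

    # User management: users receive access only through roles.
    cur.execute("""CREATE USER IF NOT EXISTS jdoe
                   PASSWORD = '***' DEFAULT_ROLE = ANALYST
                   MUST_CHANGE_PASSWORD = TRUE""")
    cur.execute("GRANT ROLE ANALYST TO USER jdoe")
    conn.close()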
Data Modeling: Collaborate with data architects and modelers to design Snowflake schemas and structures that align with FlyFrontier's business requirements. Implement best practices for organizing data within Snowflake to support analytics and reporting needs.
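As one possible shape for such a design, the sketch below creates a hypothetical star-schema pair (DIM_ROUTE, FACT_BOOKING); the names and the clustering choice are illustrative assumptions, not FlyFrontier's actual model.

    # Sketch of a reporting schema in Snowflake following a star-schema layout.
    import snowflake.connector

    conn = snowflake.connector.connect(account="flyfrontier", user="etl_user",
                                       password="***", warehouse="ETL_WH",
                                       database="ANALYTICS")
    cur = conn.cursor()
    cur.execute("CREATE SCHEMA IF NOT EXISTS REPORTING")

    # Dimension table: small, no clustering needed.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS REPORTING.DIM_ROUTE (
            route_id INTEGER PRIMARY KEY,
            origin   STRING,
            dest     STRING
        )""")

    # Fact table: clustered on flight_date, the dominant filter in reporting
    # queries, so Snowflake can prune micro-partitions effectively.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS REPORTING.FACT_BOOKING (
            booking_id  INTEGER,
            route_id    INTEGER,
            flight_date DATE,
            fare        NUMBER(10,2)
        ) CLUSTER BY (flight_date)""")
    conn.close()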
Testing & Validation: Conduct thorough testing of ADF pipelines and Snowflake data loads to ensure data accuracy, completeness, and consistency post-migration. Address any data quality issues or discrepancies identified during testing phases.
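A simple form of this validation is source-to-target reconciliation: compare row counts and a numeric checksum between MySQL and Snowflake for each migrated table, as in the sketch below. The table list and amount columns are hypothetical.

    # Post-load reconciliation between the MySQL source and Snowflake target.
    import mysql.connector
    import snowflake.connector

    mysql_conn = mysql.connector.connect(host="mysql-host", user="etl_user",
                                         password="***", database="flyfrontier_db")
    snow_conn = snowflake.connector.connect(account="flyfrontier", user="etl_user",
                                            password="***", warehouse="ETL_WH",
                                            database="ANALYTICS", schema="STAGING")

    for table, amount_col in [("bookings", "fare"), ("payments", "amount")]:
        check = f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
        src_cur = mysql_conn.cursor()
        src_cur.execute(check)
        src_count, src_sum = src_cur.fetchone()
        tgt_cur = snow_conn.cursor()
        tgt_cur.execute(check)
        tgt_count, tgt_sum = tgt_cur.fetchone()
        ok = (src_count, float(src_sum)) == (tgt_count, float(tgt_sum))
        print(f"{table}: source={src_count}/{src_sum} "
              f"target={tgt_count}/{tgt_sum} {'OK' if ok else 'MISMATCH'}")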
Documentation & Training: Document the migration process, ADF pipeline configurations, Snowflake architecture, and best practices for future reference. Provide training and support to FlyFrontier teams on using Snowflake and ADF for data management and analytics tasks.
Stakeholder Communication: Foster effective communication and collaboration with cross-functional teams, including database administrators, developers, data analysts, and business users. Provide regular updates on migration progress, issues, and resolution plans to stakeholders.
Client: United Services Automobile Association (USAA)
Project: Consumer Lending
Description:
The Consumer Loan Acquisition System and Consumer Lending Origination events systems allow members to apply for the Consumer Loan products they are eligible for, provide an automated decision on submitted applications, and fulfill each application by disbursing the loan amount or issuing a decline letter based on the decision. The Consumer Lending origination events Hive store houses the streaming data related to GSD loans to produce volume and trend-over-time reporting for CL product management.
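As an indication of the trend reporting this Hive store enables, the PySpark sketch below computes monthly GSD application volume and approval rate; the cl_origination_events table and its columns are hypothetical.

    # Monthly volume and approval-rate trends over the Hive-backed event store.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("cl-origination-trends")
             .enableHiveSupport()
             .getOrCreate())

    events = spark.table("cl_origination_events")

    trends = (events
              .filter(F.col("product") == "GSD")
              .groupBy(F.date_trunc("month", F.col("event_ts")).alias("month"))
              .agg(F.count("*").alias("applications"),
                   F.avg(F.when(F.col("decision") == "APPROVED", 1).otherwise(0))
                    .alias("approval_rate"))
              .orderBy("month"))
    trends.show()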
For the Cloud Technology Implementation at USAA, two implementation tracks will be supported by a series of deliverables necessary to upgrade and convert USAA to the new solution.
Key benefits include improved insight into the data, reduced manual effort to produce insights and answer questions from various partners, and better-informed, forward-looking decisions on the Consumer Lending portfolio.
Client: McKesson IT, US
Project: ETL Modernization
Description:
McKesson is the oldest and largest healthcare company in the US. I managed the Marketing, Berlex, XSSH, and Netezza downstream file-creation jobs and the SLX applications for OPS activity. Data is sourced from SAP ECC and other systems, loaded into the Oracle Data Warehouse (IW), and sent to multiple downstream provisioning systems such as Marketing and Netezza. The ETL Modernization project aims to modernize McKesson's legacy ETL architecture by replacing the Oracle Data Warehouse (IW), IBM DataStage 7.5.2, Business Objects Data Services, and the LPAD (Landing Pad) / DPA (Data Provisioning Area) with SAP BW 7.4, HANA 1.0, Hadoop, and IBM InfoSphere 11.5.
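Purely as an illustration of one modernization step, re-pointing a downstream feed from the Oracle IW warehouse into SAP HANA, the Python sketch below uses the oracledb and hdbcli drivers; the sales_feed table, hosts, and credentials are hypothetical, and the production implementation used the tooling named above rather than ad-hoc scripts.

    # Illustrative re-pointing of a downstream extract from Oracle IW to HANA.
    import oracledb
    from hdbcli import dbapi

    ora = oracledb.connect(user="iw_user", password="***", dsn="iw-host/IW")
    hana = dbapi.connect(address="hana-host", port=30015,
                         user="bw_user", password="***")

    # Extract the feed rows from the legacy Oracle warehouse.
    src = ora.cursor()
    src.execute("SELECT order_id, material, qty, amount FROM sales_feed")
    rows = src.fetchall()

    # Load them into a HANA staging table (qmark placeholders per hdbcli).
    tgt = hana.cursor()
    tgt.executemany(
        "INSERT INTO BW_STAGE.SALES_FEED (ORDER_ID, MATERIAL, QTY, AMOUNT) "
        "VALUES (?, ?, ?, ?)", rows)
    hana.commit()
    ora.close()
    hana.close()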
Responsibilities:
Client: Lloyds Banking Group
Project: Risk Data Warehouse
Roles & Responsibilities: