ASHIK RAPAKA

Software - Technical Lead
Hyderabad

Summary

  • 11+ years of IT experience specializing in designing and leading Enterprise Data Warehouse, data mart, and BI solutions within the Banking, Medical, and Airline domains.

Key Skills and Expertise:

  • Data Integration and ETL: Extensive expertise in data extraction, transformation, and loading (ETL) from diverse sources such as Oracle, SQL Server, MySQL, Netezza, and flat files using DataStage and UNIX shell scripting.
  • Cloud Migration: 4+ years of experience migrating data from on-premise databases such as Netezza and DB2 to cloud platforms, particularly Snowflake.
  • Data Modeling: Practical understanding of data modeling concepts, including star-schema and snowflake-schema modeling and fact and dimension tables, with hands-on implementation of Slowly Changing Dimensions (Type I & II) in dimension tables; a minimal SQL sketch follows this list.
  • Tool and Platform Migration: Experience in migration projects involving upgrades from DataStage 9.1 to DataStage 11.5, conversion of ETL code from Microsoft SQL Server Integration Services (SSIS) to DataStage 11.5, and database migration.
  • Team Leadership: Proven ability to lead technical teams of up to 15 developers across multiple concurrent projects, ensuring successful delivery and client satisfaction.
  • Cross-Functional Collaboration: Skilled at managing concurrent tasks and assignments in cross-functional, global environments; adept at effective communication and collaboration.
  • Leadership and Customer Interaction: Strong leadership skills with a track record of handling customer interactions effectively and fostering positive client relationships.
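
A minimal Snowflake SQL sketch of the SCD Type II pattern referenced above; the table and column names (dim_customer, stg_customer, address) are illustrative assumptions, not from a specific engagement:

  -- Close the current row when a tracked attribute changes (Type II).
  MERGE INTO dim_customer d
  USING stg_customer s
    ON d.customer_id = s.customer_id AND d.is_current = TRUE
  WHEN MATCHED AND d.address <> s.address THEN UPDATE SET
    is_current   = FALSE,
    effective_to = CURRENT_DATE();

  -- Insert a fresh current row for new and changed customers.
  INSERT INTO dim_customer
    (customer_id, name, address, effective_from, effective_to, is_current)
  SELECT s.customer_id, s.name, s.address, CURRENT_DATE(), NULL, TRUE
  FROM stg_customer s
  LEFT JOIN dim_customer d
    ON d.customer_id = s.customer_id AND d.is_current = TRUE
  WHERE d.customer_id IS NULL;

For a Type I attribute, the same MERGE would simply overwrite the column in place instead of closing the row.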

Overview

12 years of professional experience

Work History

Snowflake Admin & Developer

GAVS Technologies Private
04.2023 - Current

Client: FlyFrontier
Project Description:
FlyFrontier is migrating its MySQL databases to Snowflake using Azure Data Factory (ADF) pipelines. The migration enhances the airline's data warehousing capabilities, leveraging Snowflake's scalable, flexible architecture for improved analytics and insight.

Responsibilities:

Database Migration Planning: Collaborate with stakeholders to understand the MySQL database structure, data dependencies, and migration requirements. Develop a comprehensive migration plan outlining the steps, timelines, and resources needed for a smooth transition to Snowflake.

ADF Pipeline Development: Design, develop, and deploy Azure Data Factory pipelines to automate the extraction, transformation, and loading (ETL) processes from MySQL databases to Snowflake. Configure data ingestion tasks, data cleansing operations, and data validation routines within ADF pipelines.
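
Behind the ADF copy activity, the Snowflake side of such a load reduces to a staged bulk load. A hedged sketch, assuming an Azure Blob stage and illustrative names (my_azure_stage, raw.bookings); the URL and token are placeholders:

  -- External stage over the Azure Blob container where ADF lands extracts.
  CREATE STAGE IF NOT EXISTS my_azure_stage
    URL = 'azure://flyfrontierdata.blob.core.windows.net/extracts'
    CREDENTIALS = (AZURE_SAS_TOKEN = '...');

  -- Bulk-load the MySQL extract files into a raw landing table.
  COPY INTO raw.bookings
  FROM @my_azure_stage/bookings/
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
  ON_ERROR = 'ABORT_STATEMENT';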

Snowflake Administration: Administer the Snowflake environment, including user management, role assignments, and security configurations. Monitor system performance, troubleshoot issues, and optimize Snowflake configurations for efficient data processing and storage.
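
In Snowflake this administration is itself SQL; a minimal sketch with assumed names (analyst_role, reporting_wh, analytics.reporting, user jdoe):

  -- Functional role with read access to the reporting schema.
  CREATE ROLE IF NOT EXISTS analyst_role;
  GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_role;
  GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
  GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_role;
  GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_role;

  -- New user assigned to the role (password placeholder elided).
  CREATE USER IF NOT EXISTS jdoe
    PASSWORD = '...'
    DEFAULT_ROLE = analyst_role
    DEFAULT_WAREHOUSE = reporting_wh
    MUST_CHANGE_PASSWORD = TRUE;
  GRANT ROLE analyst_role TO USER jdoe;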

Schema Design: Collaborate with data architects and modelers to design Snowflake schemas and structures that align with FlyFrontier's business requirements. Implement best practices for organizing data within Snowflake to support analytics and reporting needs.

Testing and Validation: Conduct thorough testing of ADF pipelines and Snowflake data loads to ensure data accuracy, completeness, and consistency post-migration. Address any data quality issues or discrepancies identified during testing.
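
A common completeness check in such testing is reconciling the landed data against the curated target; a hedged sketch using the same illustrative table names as above:

  -- Row-count reconciliation between landing and target tables.
  SELECT (SELECT COUNT(*) FROM raw.bookings)       AS source_rows,
         (SELECT COUNT(*) FROM analytics.bookings) AS target_rows;

  -- Order-independent content fingerprint; TRUE means the contents match.
  SELECT (SELECT HASH_AGG(*) FROM raw.bookings)
       = (SELECT HASH_AGG(*) FROM analytics.bookings) AS contents_match;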

Documentation and Training: Document the migration process, ADF pipeline configurations, Snowflake architecture, and best practices for future reference. Provide training and support to FlyFrontier teams on using Snowflake and ADF for data management and analytics tasks.

Stakeholder Communication: Foster effective communication and collaboration with cross-functional teams, including database administrators, developers, data analysts, and business users. Provide regular updates on migration progress, issues, and resolution plans to stakeholders.

DBT Cloud and Snowflake Developer

HCL Technologies
03.2021 - 04.2023

Client: United Services Automobile Association (USAA)
Project: Consumer Lending

Description:
The Consumer Loan Acquisition System and Consumer Lending Origination Events systems allow members to apply for the consumer loan products they are eligible for, provide an automated decision on submitted applications, and complete the application process by disbursing the loan amount or issuing a decline letter based on that decision. The Consumer Lending Origination Events Hive houses the streaming data related to GSD loans to produce volume and over-time trending reports for CL product management.

The Cloud Technology Implementation at USAA supports two implementation tracks through a series of deliverables. These deliverables, necessary to upgrade and convert USAA to the new solution, include:

  • The migration of current data to the cloud database, Snowflake.
  • Upgrading and enhancing USAA's current Consumer Lending and tracking systems.
  • Building interfaces between on-premise (legacy) systems and the cloud platform.
  • Rewriting the existing ETL code in the cloud ELT tool dbt (Data Build Tool), as sketched after this list.
  • Performing the gap/business process analysis.
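
In a DataStage-to-dbt rewrite of this kind, each job's transformation logic typically becomes a SQL model; a minimal sketch, assuming a hypothetical model and source (stg_loan_applications, cl_raw) rather than USAA's actual objects:

  -- models/staging/stg_loan_applications.sql (hypothetical dbt model)
  {{ config(materialized='incremental', unique_key='application_id') }}

  SELECT
      application_id,
      member_id,
      product_code,
      decision_status,
      loaded_at
  FROM {{ source('cl_raw', 'loan_applications') }}
  {% if is_incremental() %}
  -- On incremental runs, process only rows newer than the last load.
  WHERE loaded_at > (SELECT MAX(loaded_at) FROM {{ this }})
  {% endif %}

dbt's source() and ref() macros replace hand-wired job dependencies with a lineage graph that dbt resolves at run time.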

Key benefits include improved insight into the data, reduced manual effort to produce insights and answer questions from various partners, and support for making the best point-forward decisions on the Consumer Lending portfolio.

Designer & Developer

Cognizant Technologies
07.2016 - 02.2021

Client: McKesson IT, US
Project: ETL Modernization

Description:
McKesson is the oldest and largest healthcare company in the US. I managed the Marketing, Berlex, XSSH, and Netezza downstream file-creation jobs and the SLX applications as part of OPS activity. Source data is extracted from SAP ECC and other systems, loaded into the Oracle Data Warehouse (IW), and sent to multiple downstream provisioning areas such as Marketing and Netezza. The ETL Modernization project modernizes McKesson's legacy ETL architecture by replacing the Oracle Data Warehouse (IW), IBM DataStage 7.5.2, Business Objects Data Services, and the LPAD (Landing Pad) / DPA (Data Provisioning Area) with SAP BW 7.4, HANA 1.0, Hadoop, and IBM InfoSphere 11.5.

Responsibilities:

  • Supported Berlex, Marketing, Netezza, XSSH, and SLX application jobs and handled all critical, high, medium, and low tickets within SLA in the production environment.
  • Monitored jobs in the Maestro scheduler and updated business stakeholders when reports were delayed.
  • Analyzed DataStage 7.5 server/parallel jobs, prepared LLDs, and supported team members as an Associate.
  • Reported the team's status, progress, and bugs to stakeholders.
  • Reviewed and analyzed detailed system specifications for Talend ETL and related applications to ensure they appropriately addressed business requirements.
  • Automated a UNIX scripting looping mechanism that parses multiple invocation IDs for DataStage jobs.
  • Worked on exception handling and process control by updating batch IDs and invocation IDs in the job control tables of the ETL frameworks (see the sketch after this list).
  • Worked on release management activities such as creating cutover and rollout plans, attaching test evidence, and obtaining approvals from the Oracle DBAs, the TWS team, and the change management team.
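
A hedged sketch of the process-control update described above; the control table and columns (etl_job_control, batch_id, invocation_id) are assumptions standing in for the framework's actual tables:

  -- Mark one invocation of a looped DataStage job complete.
  UPDATE etl_job_control
  SET    status   = 'COMPLETED',
         end_time = CURRENT_TIMESTAMP
  WHERE  job_name      = 'LOAD_SALES_FACTS'  -- hypothetical job name
  AND    batch_id      = 1042                -- current batch
  AND    invocation_id = 'REGION_EAST';      -- one slice of the loop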

ETL Developer

HCL Technologies
07.2012 - 06.2016

Client: Lloyds Banking Group
Project: Risk Data Warehouse

Roles & Responsibilities:

  • Interacted with the business and IT users of the risk platform and worked as the onshore/offshore coordinator for a team of 7 members.
  • Understood the project's processes and studied the existing system.
  • Understood the requirements and prepared the technical design documents (DPAD).
  • Designed and developed the ETL code with DataStage PX and Oracle, building jobs with various stages: Dataset, Sequential File, Change Capture, Lookup, Remove Duplicates, Hashed File, Sort, Join, and Transformer.
  • Prepared UNIX scripts and scheduled them using Tivoli Workload Scheduler; worked on RTL DataStage code movement during implementations.
  • Prepared unit test plans and performed unit testing.
  • Performed performance tuning in DataStage and Oracle.
  • Conducted impact analysis across the application for any changes to source data.

Education

Bachelor of Technology - Computer Engineering Technology

Gitam University
Visakhapatnam, India
04.2001 -

MPC -

Sri Kakatiya
Vijayawada, India
04.2001 -

Skills

  • ETL/ELT: IBM DataStage, dbt (Data Build Tool), Azure Data Factory, SSIS
  • Databases and cloud: Snowflake, Oracle, SQL Server, MySQL, Netezza, DB2
  • Scripting and scheduling: UNIX shell scripting, Tivoli Workload Scheduler, Maestro
  • Data modeling: star and snowflake schemas, fact and dimension tables, SCD Type I & II
  • Leadership: technical team leadership (up to 15 developers), onshore/offshore coordination

Timeline

Snowflake Admin & Developer

GAVS Technologies Private
04.2023 - Current

DBT Cloud and Snowflake Developer

HCL Technologies
03.2021 - 04.2023

Designer & Developer

Cognizant Technologies
07.2016 - 02.2021

ETL Developer

HCL Technologies
07.2012 - 06.2016

Bachelor of Technology - Computer Engineering Technology

Gitam University
04.2001 -

MPC -

Sri Kakatiya
04.2001 -