To work in a progressive organization where I can apply my knowledge and enhance my skills to serve the organization efficiently. I possess work experience in areas such as Analysis, Development, Implementation, and Support for business requirements. Seeking a challenging position in Data Warehousing ETL, Business Intelligence, and quality processes in a dynamic environment where creativity and technology integrate seamlessly, with scope for professional advancement.
I have been an associate with Tata Consultancy Services for the last 11 years, with functional expertise in the Telecom and Banking domains. My technological forte is Data Warehousing, primarily with Informatica and PL/SQL.
LRADS is the Loss Reserving Analytical Data Store.
The purpose of LRADS is to serve as a source of premium and loss data used for reserve estimation, trend analytics, and catastrophe analytics. The database, VB_LOSS_RESERVING_ADS, is a SQL Server database. All LRADS processes extract financial premium and loss data from the Netezza warehouse and load it into the LRADS data mart. As part of the modernization effort, the Netezza and SQL Server warehouses are being replaced by Snowflake.
Currently, as part of the modernization, replacing Informatica with DBT and migrating from Netezza to Snowflake.
Converting Informatica mappings to DBT models with CTEs per the design, committing the latest models to Git, and migrating them to higher environments.
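The mapping-to-model conversion described above can be sketched as a DBT model in which each Informatica transformation step becomes a named CTE. The model, source, and column names below are illustrative, not the actual project objects.

```sql
-- models/staging/stg_premium_detail.sql  (hypothetical model name)
-- Each transformation step from the original Informatica mapping
-- is expressed as a named CTE, then selected at the end.
with source_premiums as (

    -- pull raw rows from the migrated source (illustrative source name)
    select policy_id, premium_amount, effective_date
    from {{ source('loss_reserving', 'premium_detail') }}

),

filtered as (

    -- filter step that replaces the mapping's filter transformation
    select *
    from source_premiums
    where premium_amount is not null

)

select
    policy_id,
    premium_amount,
    effective_date
from filtered
```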
• Analyzing business requirements with the Business Analyst to prepare mapping sheets for ETL procedures that are consistent across all systems, in order to migrate Informatica components from Netezza to Snowflake.
• Documenting at a functional level how the Informatica mappings work.
• Monitoring batches and sessions using the Informatica PowerCenter server.
• Working with business teams to create design specifications that maintain data accuracy while migrating from Netezza to Snowflake.
• Shared best practices for data warehousing, ETL techniques, and common recurring issues across teams through sessions, mailers, etc.
• SAND (Sainsbury Archiving and Decommission): Sainsbury's is an existing bank associated with LBG.
• All SAND applications are to be decommissioned and their data deleted. MBNA is a new bank being integrated with LBG.
• Worked on the ETL changes required to decommission the existing bank and onboard the new one. Performed impact analysis and produced multi-level project documentation providing the impact and solution for DataStage jobs on a broader scale.
• Worked on DataStage server migration activities from 9.1 to 11.3.
• Handled migration scripts that move files from the old server to the new server. Worked on DataStage production issues such as monitoring of jobs and the database.
• Operational Risk & Compliance Integrated Tech (ORCIT): Enhance and simplify the risk and compliance framework to enable end-to-end integration of business, compliance, and operational risk processes and controls, and to enhance reporting of standard metrics in line with laws and regulations.
• Interacted with the Business Analysts to understand the business and gather technical requirements. Extracted relational and delimited text file data to the local environment and developed the code in the dev environment.
• Automated and scheduled recurring reporting processes using Unix shell scripting and Teradata utilities. Checked the formats of the source flat files using Unix shell scripts, thereby ensuring input file formats match the specification. Created a Mapplet for the error-handling process to deal with null records. Involved in migration/automation processes for building and deploying systems.
• Developed code for the landing environment, then for staging, and finally developed the incremental load to populate the target tables of the atomic model. Extensively involved in writing SQL queries (sub-queries and join conditions) and PL/SQL procedures.
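The landing → staging → target incremental load described above can be sketched as a MERGE that applies only new or changed staging rows to the atomic target. Schema, table, and column names here are hypothetical stand-ins, not the project's actual objects.

```sql
-- Hypothetical incremental load: staging rows are merged into the
-- atomic target table; only newer versions of a row cause an update.
merge into atomic.customer_account tgt
using staging.customer_account src
    on tgt.account_id = src.account_id
when matched and src.last_update_ts > tgt.last_update_ts then
    -- changed record: refresh the target row
    update set
        tgt.balance        = src.balance,
        tgt.last_update_ts = src.last_update_ts
when not matched then
    -- new record: insert it into the target
    insert (account_id, balance, last_update_ts)
    values (src.account_id, src.balance, src.last_update_ts);
```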
• COLO2: Creating a disaster recovery platform to safeguard data against natural or unplanned outages at a production site by providing a recovery strategy for applications and data at a geographically separate standby site. Enterprise deployments need protection from unforeseen disasters and natural calamities.
• One protection solution involves setting up a standby site at a geographically different location than the production site.
• The standby site may have equal or fewer services and resources compared to the production site.
• Application data, metadata, configuration data, and security data are replicated to the standby site on a periodic basis. The program has two phases: first, a direct copy of all production tables on the ICDW platform to the Disaster Recovery platform; second, synchronizing data from the ICDW platform to the DR platform.
• Colo TS (Timestamp) is one of the modules in the COLO program; it handles adding an attribute to identify delta records in existing tables. Enhanced the existing code to meet the requirements (code and DB component changes).
• Implemented Informatica code changes to identify new records. Created Teradata Data Mover jobs that identify the new data to be transferred to the COLO server. Monitored the Data Mover jobs and fixed failures as needed.
• Performed unit testing and migrated code to higher environments. Addressed defects raised during Quality Assurance.
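The timestamp-attribute approach above can be sketched as a delta query: once a load-timestamp column exists on each table, downstream jobs select only rows created or changed since the last successful transfer to the standby site. Table, column, and control-table names below are assumptions for illustration.

```sql
-- Hypothetical delta query: LOAD_TS is the attribute added by the
-- Colo TS module; SYNC_AUDIT is an assumed control table recording
-- the timestamp of the last successful sync per table.
select t.*
from icdw.transaction_detail t
where t.load_ts > (
    select max(last_sync_ts)
    from dr_control.sync_audit
    where table_name = 'TRANSACTION_DETAIL'
);
```

Only the rows returned by this query would need to be shipped to the DR site, rather than re-copying the full table each cycle.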
• Wave6 is an Integrated Consumer Data Warehouse program.
• The Wave6 program at JPMC builds an integrated system/model at various levels, such as account level, household level, etc. The existing JPMC model has separate marts such as Customer (CST), Know Your Customer (KYC), Compliance (CMP), and Relational Data Mart (RDM). The work involved migrating code from Ab Initio to Informatica and from DB2 to Teradata.
• In KYC, transactions are identified for unauthorized groups. The new ICDW model's objective is to consolidate all the marts into a single entity.
• As part of this initiative, the new data warehouse is built on Teradata using Informatica within the Integrated Consumer Data Warehouse (ICDW) architecture. Captured the existing EDW system logic through a mechanism called Reverse Engineering (RE). Prepared mapping/requirements/high-level design documents following the business rules of the existing system.
• Developed Informatica mappings and workflows, enabling Pushdown Optimization (PDO), based on the requirements documents.
• Developed the necessary components in the Teradata database. Addressed defects raised during the Quality Assurance and User Acceptance Testing cycles.
• Prepared the necessary documentation for production implementation.
• CenturyLink is a multinational communications company headquartered in Monroe, Louisiana. It provides communications and data services to residential, business, governmental, and wholesale customers. The company, incorporated as Central Telephone & Electronics Corporation in 1968, changed its name to Century Telephone Enterprises, Inc. in 1971, and was called CenturyTel, Inc. from 1999 to 2010.
• Netezza to Oracle Exadata migration: the current external hosting solution with TEOCO for the Netezza appliance needed to be evaluated for migration to an internal Oracle Exadata appliance. TEOCO currently provides the Landing Zone box, Business Objects services, and the Netezza appliance box for $45K per month for up to 12 terabytes.
• Migrating this data to an in-house Oracle Exadata environment would save approximately $540K per year in external spend (offset by in-house environment costs).
• Developing mappings based on the requirements, developing the corresponding sessions and workflows, unit testing the code, and completing the deliverables on time.
• Monitoring jobs in all environments in a timely manner. Performing database software installations and configurations.
• Creating database objects such as tables, views, indexes, procedures, etc. Managing roles and privileges for database users based on their roles and responsibilities. Monitoring the database and taking timely action to ensure database performance. Splitting and managing the database across different environments.
• Supporting database users in all DBA-dependent activities. Performing data model changes on the database.
• Monitoring database resource utilization by users in a timely manner to avoid space concerns.
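The routine DBA tasks listed above (object creation and role-based privilege management) can be sketched in SQL; the schema, table, role, and user names here are illustrative only.

```sql
-- Minimal sketch of recurring DBA tasks (hypothetical names throughout).

-- Creating database objects: a table and a supporting index.
create table app.orders (
    order_id    number primary key,
    customer_id number not null,
    created_at  date default sysdate
);
create index idx_orders_customer on app.orders (customer_id);

-- Managing roles and privileges based on user responsibilities:
-- read-only access is bundled into a role and granted to a user.
create role reporting_reader;
grant select on app.orders to reporting_reader;
grant reporting_reader to report_user;
```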
Informatica PowerCenter
Assistant Consultant