Sriramoji Sairam

Confluent Kafka Engineer | Hadoop Administrator
Hyderabad

Summary

Experienced IT professional with over 5 years of experience, including 3 years as a Confluent Kafka Administrator with GCP and 2 years as a Hadoop Administrator. Proven track record of initiative and responsibility, contributing effectively as a team player. Recognized for quick learning abilities and adeptness in exploring and implementing new technologies to enhance efficiency and drive innovation within the organization.

Overview

5 years of professional experience
3 languages

Work History

Middleware/EAI Kafka Engineer

HCL
03.2022 - Current

PROJECT 1: Mattel,

Middleware/EAI Kafka Engineer, GCP (Kubernetes, Cloud Run, App Engine, Confluent Kafka). Mattel's primary focus is being a global leader in children's entertainment, offering a wide range of toys, games, and entertainment experiences; this project supports that business.

Roles and Responsibilities:

  • Managing data processing pipelines on Confluent Kafka clusters.
  • Creating Kafka topics and connectors via GitLab CI/CD pipelines or deploying infrastructure using Terraform.
  • Performing administrative tasks such as pausing/resuming connectors, resetting/skipping topic offsets, and updating topic configurations (e.g., retention period, schema updates).
  • Handling end-to-end data pipeline management, including monitoring and troubleshooting production issues using ServiceNow and Jira.
  • Acting as a Kafka admin resource by coordinating with business and functional teams to address security and compliance requirements.
  • Supporting enhancement activities and new project initiatives.
  • Developing and deploying services on Kubernetes, Cloud Run, and App Engine using CI/CD pipelines.
  • Monitoring GCP logs and debugging issues; involved in provisioning and decommissioning GCP services.
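Connector administration of the kind listed above (pausing and resuming connectors) goes through the standard Kafka Connect REST interface. A minimal sketch, assuming a placeholder Connect host; the endpoint paths are the documented `PUT /connectors/{name}/pause` and `/resume` routes:

```python
# Sketch: pause/resume a connector via the Kafka Connect REST API.
# CONNECT_URL and the connector name are placeholder assumptions.
import urllib.request

CONNECT_URL = "http://connect.example.com:8083"  # hypothetical host

def connector_action_url(base, name, action):
    """Build the Connect REST URL for pausing or resuming a connector."""
    assert action in ("pause", "resume")
    return f"{base}/connectors/{name}/{action}"

def pause_connector(name, base=CONNECT_URL):
    # Connect expects an empty-bodied PUT; returns 202 Accepted on success.
    req = urllib.request.Request(connector_action_url(base, name, "pause"),
                                 method="PUT")
    return urllib.request.urlopen(req).status

print(connector_action_url(CONNECT_URL, "orders-sink", "pause"))
# → http://connect.example.com:8083/connectors/orders-sink/pause
```

In practice the same pattern covers the other admin endpoints (`/status`, `/restart`), so one small helper handles most day-to-day connector operations.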

Hadoop Administrator

TCS
02.2020 - 03.2022

PROJECT 2: Citi Bank

Hadoop Administrator, TCS (Contract), Hyderabad, India.

This project provides an all-in-one central repository: a converged data platform that lets corporate banking integrate and analyze a wide variety of online and offline corporate client data, including revenue transactions, scorecards, PMI data, Deals, DealCenter, account plans, customer onboarding and offboarding, clickstream data, email, point-of-sale (POS) systems, and call report records. Relationship managers can analyze this data to generate insights about institutional clients' behaviors and preferences and offer product recommendations (FM, TB, CC, etc.) such as dynamic pricing across multiple channels, up-sell/cross-sell recommendations, and loyalty programs. Key to this is the ability to optimize merchandise deals between clients and the bank, with credits tailored to individual clients and group clients. The data is stored in the Hadoop file system and processed using Pig, Hive, and MapReduce jobs; ingestion of data is done through Sqoop/Flume.

Roles and Responsibilities:

  • Managed and scheduled jobs on a Hadoop cluster.
  • Handled end-to-end data pipeline management.
  • Monitored and troubleshot the production cluster in a Hortonworks environment.
  • Acted as a Hadoop security support resource, coordinating with business and functional teams to understand security requirements.
  • Coordinated major projects and established design principles for the security team.
  • Supported enhancements and new projects.
  • Tuned performance-related system configuration parameters and backed up configuration XML files.
  • Performed NameNode recovery and Hadoop cluster balancing.
  • Commissioned and decommissioned DataNodes.
  • Monitored the cluster using tools like Ganglia and Nagios.
  • Configured Flume for data transfer from web servers to the Hadoop cluster.
  • Administered and optimized Hadoop clusters.
  • Monitored MapReduce jobs and worked with the development team to fix issues.
  • Loaded data from various data sources into HDFS using Sqoop and Flume.
  • Analyzed log files for Hadoop and ecosystem services to find root causes.
  • Took backups of cluster data using snapshots.
  • Worked with infrastructure, network, and application teams to guarantee high data quality.
  • Troubleshot, diagnosed, and resolved HDFS and MapReduce job issues.
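The log-analysis task above (scanning Hadoop service logs for a root cause) can be sketched as a small script that tallies ERROR lines per component. The log layout (`date time LEVEL component: message`) is an assumption for illustration, not the exact format of any one Hadoop distribution:

```python
# Sketch: tally ERROR lines per Hadoop component to narrow down a root cause.
# The log line format here is a simplified assumption.
import re
from collections import Counter

LINE_RE = re.compile(r"^\d{4}-\d{2}-\d{2} \S+ (ERROR|WARN|INFO) (\S+): (.*)$")

def error_counts(lines):
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and m.group(1) == "ERROR":
            counts[m.group(2)] += 1
    return counts

sample = [
    "2022-01-10 09:01:02 INFO NameNode: startup complete",
    "2022-01-10 09:05:14 ERROR DataNode: disk failure on /data/3",
    "2022-01-10 09:05:15 ERROR DataNode: block report failed",
    "2022-01-10 09:06:00 WARN DFSClient: slow read",
]
print(error_counts(sample))  # → Counter({'DataNode': 2})
```

A component whose error count spikes (here, the DataNode) is the first place to look before digging into individual stack traces.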

Education

Intermediate

Board of Intermediate
Hyderabad, India
05.2010

B.Tech

Jawaharlal Nehru Technological University
06.2014

Skills

  • Confluent Kafka (real-time streaming platform): hands-on experience with topic and partition creation, Control Center, Kafka Connect, and Schema Registry
  • Google Cloud Platform (GCP): App Engine, Cloud Run, and Workloads; Database: GCP Firestore; Cloud Scheduler; Storage: GCP buckets; Monitoring: GCP service logs
  • Monitoring Tools: Grafana, Sumo Logic, GCP Logs
  • DevOps Tools: GitLab, CI/CD pipelines
  • APIs & Testing Tools: Postman
  • Ticketing Tools: Jira, ServiceNow (SNOW)
  • Programming Languages: SQL, Python (beginner)

  • Framework: Hadoop
  • Hadoop Components: HDFS, YARN, MapReduce
  • Hadoop Ecosystem Tools: Sqoop, Pig, Hive, HBase, Kafka, Flume, Storm, Spark, Presto, ZooKeeper, Elasticsearch, Oozie, and Impala
  • Hadoop Distributions: Apache Hadoop, Hortonworks, and Cloudera
  • Monitoring Tools: Cloudera Manager, Ambari, Nagios, Ganglia, Zabbix
  • Security Tools: Kerberos, Ranger, Ranger KMS, Knox, Sentry
