
SWETHA KANUMURI

Melbourne, VIC

Summary

Dynamic ETL Developer practiced in helping companies through complex data transitions, including sensitive financial and healthcare data and large big data installations. Promotes extensive simulation and testing to ensure smooth ETL execution, and known for building quick, effective tools that automate and optimize database management tasks. Background includes data mining, warehousing, and analytics, with proficiency in machine learning and deep learning. Quality-driven, organized, and dependable, with excellent communication and project management skills, a record of managing multiple priorities accurately, and a willingness to take on added responsibilities to meet team goals. Seeking a full-time position that offers professional challenges drawing on interpersonal, time management, and problem-solving skills.

Overview

5 years of professional experience

Work History

ETL Developer (Informatica & Snowflake)

Cognizant
08.2021 - 11.2022
  • At Key Bank, played a pivotal role in a project focused on developing and deploying ETL applications using Informatica and Teradata SQL for comprehensive data management.
  • ETL program development: created new jobs to meet diverse business requirements, rigorously tested them with dummy data, and crafted complex Informatica mappings to address intricate business needs.
  • Used Informatica file-watch events to monitor FTP sites for external mainframe files.
  • Job automation: designed and implemented automated mainframe jobs with interdependencies, managing workflows with multiple sessions using the Informatica scheduler.
  • Database utilities: worked extensively with Teradata utilities, including BTEQ, FastLoad, MultiLoad, and SQL DDL and DML commands.
  • Wrote and optimized in-application SQL statements.
  • Collaborated with business intelligence staff at customer facilities to produce customized ETL solutions for specific goals.
  • Managed data quality issues during ETL processes, escalating qualitative failures to the team lead for resolution.
  • Designed and created ETL code installations, aiding transitions from one data warehouse to another.
  • Reviewed project requests describing database user needs to estimate the time and cost required to accomplish projects.
  • Documented and communicated database schemas using accepted notations.
  • Created Teradata macros in SQL Assistant to support analysts while minimizing data transfer over the network using relational SQL.
  • Production support and debugging: promoted error-free code to production, debugged issues, and provided production support; performed performance tuning at both the functional and mapping levels and identified bugs in existing mappings through data-flow analysis and transformation evaluation using debugging tools.
  • Tools: leveraged Hadoop and Unix commands to enhance ETL job performance without disrupting daily operations; used ITSM and project management platforms such as ServiceNow, JIRA, and Jenkins for service requests and code promotion.
  • Collaboration and communication: maintained open communication with business customers to address issues and gather requirements; contributed to data warehouse enhancements, including stored-procedure tuning and modification, and participated in team code reviews, conducting unit testing at various stages of the ETL process.
  • Used DBT (Data Build Tool) to transform and model data for analytics and reporting.
  • Experienced with Python.
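As a rough illustration of the file-watch pattern described above (polling a landing directory for an external mainframe trigger file before starting a load), a minimal shell sketch might look like the following. The directory, trigger-file name, and polling parameters are hypothetical; in the actual project this was handled by Informatica's built-in file-watch events.

```shell
#!/bin/sh
# Minimal file-watch sketch: poll a landing directory for a trigger file
# before kicking off the downstream load. All names are illustrative.

watch_for_trigger() {
    landing_dir="$1"     # directory where the FTP drop lands
    trigger_file="$2"    # sentinel file written when the extract completes
    max_polls="$3"       # give up after this many polls
    sleep_secs="$4"      # pause between polls

    polls=0
    until [ -f "$landing_dir/$trigger_file" ]; do
        polls=$((polls + 1))
        if [ "$polls" -ge "$max_polls" ]; then
            echo "Timed out waiting for $trigger_file" >&2
            return 1
        fi
        sleep "$sleep_secs"
    done
    echo "Trigger found; starting load"
}

# Demo: simulate the mainframe drop, then watch for it.
LANDING_DIR=$(mktemp -d)
touch "$LANDING_DIR/EXTRACT_DONE.trg"
watch_for_trigger "$LANDING_DIR" "EXTRACT_DONE.trg" 5 1
```

The timeout guard matters in practice: without it, a missed upstream extract would leave the watcher (and the batch window) hanging indefinitely.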

ETL Developer (Ab Initio & Informatica)

NTT Data
06.2017 - 08.2021
  • Our team specializes in the financial domain, working with BCBSNC, a fully taxed, not-for-profit health insurance company based in North Carolina, USA.
  • We develop new jobs to meet various business requirements and collaborate with business users to resolve existing data issues.
  • To achieve this, we use reverse-engineering techniques to understand the current functionality and data flow of existing objects, which lets us identify where targeted enhancements can prevent issues from recurring.
  • Responsibilities:
  • Involved in development, enhancement, problem analysis/resolution, coding, and testing phases of several requirements from business users.
  • Created new mapping designs using various tools in Informatica Designer like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
  • Performed data manipulations using various Informatica transformations, including Filter, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union.
  • Developed and modified existing graphs using Ab Initio components as per the business requirements.
  • Performed transformations of source data with transform components such as Replicate, Reformat, Filter-by-Expression, and Rollup.
  • Worked with de-partition components such as Gather and Merge to combine partitioned files for faster processing of data files.
  • Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
  • Used Informatica reusability at various levels of development.
  • Created shell scripts for archival process and file check.
  • Extensively handled data issues by analysing root causes and implementing history fixes, as well as incremental fixes where needed.
  • Proficiently administered Control-M, overseeing job scheduling, automation, and system stability.
  • Designed and implemented automated workflows, reducing manual intervention and enhancing system efficiency.
  • Collaborated with cross-functional teams to analyse business requirements, resolve issues, and maintain comprehensive documentation of best practices for Control-M management.
  • Identified, analysed, and fixed the issues in the production system.
  • Implemented checklists while promoting to testing and production.
  • Worked with DEV, UAT, and PROD support teams to implement testing and load data using ETL tools.
  • Used incident and change management tools (RTC, ServiceNow) and Jenkins to promote code as part of the project's internal processes and procedures.
  • Wrote and optimized in-application SQL statements.
  • Managed data quality issues during ETL processes, directing qualitative failures to team lead for amelioration.
  • Collaborated with system architects, design analysts and others to understand business and industry requirements.
  • Gathered, defined and refined requirements, led project design and oversaw implementation.
  • Designed data models for complex analysis needs.
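A simplified sketch of the kind of archival and file-check shell scripting mentioned above. The directory layout, the `*.dat` pattern, and the retention window are hypothetical examples, not details from the actual project.

```shell
#!/bin/sh
# Simplified archival and file-check routines; all names and the
# retention period are illustrative.

archive_old_files() {
    src_dir="$1"; archive_dir="$2"; keep_days="$3"
    mkdir -p "$archive_dir"
    # Move data files older than $keep_days days into the archive area.
    find "$src_dir" -maxdepth 1 -type f -name '*.dat' -mtime +"$keep_days" \
        -exec mv {} "$archive_dir"/ \;
}

check_daily_file() {
    # Fail fast if the expected extract is missing or empty.
    if [ ! -s "$1" ]; then
        echo "MISSING_OR_EMPTY: $1" >&2
        return 1
    fi
    echo "OK: $1"
}

# Demo: an old file gets archived; today's file passes the check.
SRC=$(mktemp -d); ARCHIVE="$SRC/archive"
touch -t 202001010000 "$SRC/old_extract.dat"
printf 'header\n' > "$SRC/daily_extract.dat"
archive_old_files "$SRC" "$ARCHIVE" 7
check_daily_file "$SRC/daily_extract.dat"
```

Splitting the work into an archival step and a non-empty-file check keeps each script reusable across jobs and makes failures easy to trap in the scheduler.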

Education

Jawaharlal Nehru Institute of Technology, Hyderabad
India
06.2016

Skills

  • SQL
  • PL/SQL
  • Python
  • Unix
  • Teradata V2R6.2
  • SQL Server
  • DB2
  • Windows 2008/12/16
  • Mainframes
  • Informatica PowerCenter (7.1.3/8.6.1)
  • Ab Initio (GDE 3.1.3.2, Co>Op 2.15, Web EME 3.0)
  • DBT
  • Decision-Making
  • Data Management
  • Excellent Communication
  • Social Perceptiveness
  • Planning and Coordination
  • Team Management
  • Team Building
  • Multitasking Abilities
  • Written Communication
  • Analytical Thinking
  • Project Planning
  • Calm Under Pressure
  • Supervision and Leadership
  • Critical Thinking

References

References available upon request.
