PHANEENDRA BABU TALASIL

Docklands, Australia

Summary

Accomplished Cloud Data Engineer with expertise in Ab Initio ETL development and data warehousing at ITIDATA. Proven track record in optimizing ETL performance and implementing CI/CD pipelines, enhancing data integration and reliability. Strong analytical skills combined with effective collaboration with stakeholders to drive data-driven decision-making. AWS-certified and proficient in Python scripting.

Overview

11 years of professional experience

Work History

Cloud Data Engineer

ITIDATA, an EXL Company
Melbourne, Australia
03.2022 - Current
  • Designed, developed, and maintained complex ETL pipelines using Ab Initio, ensuring accurate and efficient data integration from diverse sources into the data warehouse.
  • Collaborated with business stakeholders to define data requirements, translated business needs into ETL workflows, and ensured high-quality data availability for analytics and reporting.
  • Leveraged AWS S3 for scalable and secure data storage, implementing data partitioning and compression strategies to optimize query performance.
  • Integrated systems with various technologies, including MQs, APIs, and Kafka, following best practices for reliable data flow.
  • Implemented CI/CD concepts and created pipelines in Jenkins and Ab Initio EME for automated and efficient deployment.
  • Collaborated with data architects to design and implement data warehousing solutions, ensuring efficient data storage, retrieval, and integration.
  • Worked with databases including Postgres, Oracle, and MongoDB, ensuring seamless data integration across multiple platforms.
  • Developed Python scripts to extract data from web service APIs and load it into databases.
  • Conducted data analysis using SQL and Python to derive insights and support decision-making processes.
  • Design, develop, and maintain CI/CD pipelines using GitHub Actions, enabling automated triggers for build, test, and deployment workflows.
  • Create and manage workflows for code integration, automated testing, and deployment to staging and production environments.
  • Review, manage, and approve Pull Requests (PRs) following development standards and best practices.
  • Develop and maintain ETL processes on EC2-hosted Linux machines, ensuring scalability, performance, and reliability.
  • Coordinate and document Change Requests (CRs) for production deployments, following ITIL or internal governance processes.
  • Collaborate with development, QA, and operations teams to ensure seamless delivery of new features and infrastructure changes.
  • Monitor and troubleshoot deployment pipelines and EC2-based applications, ensuring high availability and performance.
  • Build CI/CD pipelines using Azure DevOps, PowerShell, and ARM/Bicep templates to automate deployment and validation across environments.
  • Provision scalable and secure Azure resources, including VNets, NSGs, Storage Accounts, and Key Vaults, using Infrastructure as Code with Bicep.
  • AWS-certified cloud engineer skilled in designing and implementing scalable architectures to enhance performance and reliability.






Senior Associate

ITIDATA, an EXL Company
Hyderabad, India
10.2017 - 02.2022
  • Company Overview: Incandescent Technologies is a technology solutions provider specializing in data management and analytics.
  • Coordinated with business and data analysts to scope and deliver enhancements.
  • Enhanced ETL performance by implementing best practices, optimizing query performance, and utilizing parallel processing techniques.
  • Collaborated with data architects to design and implement data warehousing solutions, ensuring efficient data storage, retrieval, and integration.
  • Designed ETL processes to extract data from various internal and external sources, ensuring data integrity and accuracy during extraction.
  • Developed Ab Initio graphs and scripts to perform the data transformations, aggregations, calculations, and enrichments needed to calculate net income revenue.
  • Applied data quality checks to identify and handle anomalies, inconsistencies, and missing values.
  • Fine-tuned ETL processes for optimal performance, including optimizing query execution, data partitioning, and parallel processing.
  • Identified and eliminated bottlenecks in the ETL workflow to minimize processing time.
  • Integrated data from multiple sources to create a consolidated view of net income revenue, incorporating reconciliation processes where required.
  • Implemented data validation and reconciliation routines to ensure the accuracy and consistency of net income revenue data across all stages of the ETL process.
  • Monitored data quality metrics and addressed data quality issues promptly to maintain the reliability of financial reporting.

Associate Consultant

Capgemini
Chennai, India
11.2016 - 11.2017
  • Collaborated with business stakeholders to understand the requirements and data sources related to online transaction processing.
  • Designed ETL processes to extract data from various sources, including databases, APIs, and event streams, while ensuring minimal latency.
  • Developed Ab Initio graphs and components to transform, cleanse, and enrich incoming data in real time, conforming to business rules and requirements.
  • Optimized loading processes to minimize downtime and impact on system performance.
  • Designed ETL workflows to operate in an event-driven architecture, responding to incoming events and triggers to process transactions promptly.
  • Collaborated with data governance teams to ensure compliance with data quality standards.

Software Engineer

Tech Mahindra
Bhubaneswar, India
03.2014 - 08.2016
  • Designed and developed ETL processes to extract data from sources such as databases, flat files, and APIs, and integrated it into the OLAP system.
  • Applied dimensional modeling techniques to design efficient data structures (e.g., star schema or snowflake schema) that support analytical querying and reporting.
  • Created appropriate dimensions and fact tables to model collections and recovery data effectively.
  • Optimized ETL processes and cube performance for efficient querying and reporting by implementing indexing, partitioning, and aggregation strategies.
  • Implemented data validation rules and data quality checks to ensure that the OLAP data accurately represents collections and recovery activities.
  • Collaborated with business analysts and subject matter experts to gather requirements related to collections and recovery analytics.

Education

Computer Science and Engineering

Jawaharlal Nehru Technological University
05.2013

Skills

  • Ab Initio
  • Data Engineering
  • Data Warehousing
  • Oracle
  • Python Scripting
  • Dimensional modeling
  • Cloud Engineering
  • API Integration
  • Agile Methodologies
  • MQs and Kafka
  • CI/CD pipelines
  • Financial reporting
  • Risk management
  • Data Governance
  • GitHub
  • Cloud Infrastructure Deployment and Configuration

Projects

ANZ Bank

Melbourne, Australia
03.2020 - Present
This project deals with a number of daily feeds of demand deposit account and transactional data, which undergo transformations to decide whether transactions are authorized or dishonored.
  • Designed, developed, and maintained complex ETL pipelines using Ab Initio, ensuring accurate and efficient data integration from diverse sources into the data warehouse.
  • Collaborated with business stakeholders to define data requirements, translated business needs into ETL workflows, and ensured high-quality data availability for analytics and reporting.
  • Leveraged AWS S3 for scalable and secure data storage, implementing data partitioning and compression strategies to optimize query performance.
  • Engineered API solutions to enable seamless data exchange between internal systems, improving data accessibility and reducing latency.
  • Integrated systems with various technologies, including MQs, APIs, and Kafka, following best practices for reliable data flow.
  • Maintained comprehensive documentation of ETL processes, including data lineage, transformations, dependencies, and business rules.
  • Embraced Agile methodologies for project management, ensuring iterative development, collaboration, and rapid adaptation to changing business needs.
  • Implemented CI/CD concepts and created pipelines in Jenkins and Ab Initio EME for automated and efficient deployment.
  • Collaborated with data architects to design and implement data warehousing solutions, ensuring efficient data storage, retrieval, and integration.
  • Maintained documentation for ETL processes, data dictionaries, and workflows, facilitating knowledge sharing and ensuring consistent data management practices.
  • Worked with databases including Postgres, Oracle, and MongoDB, ensuring seamless data integration across multiple platforms.
  • Technologies: file-based and API integration, event-driven architecture, Ab Initio configuration tools, and REST service development.

CITI Bank

Hyderabad, India
11.2017 - 02.2020
This project deals with detailed transactions for Citi products such as Loans, Securities, Deposits, Cards, and Cash for net interest rate calculations. Each product processes more than 10 million in a day.
  • Coordinated with business and data analysts to scope and deliver enhancements.
  • Enhanced ETL performance by implementing best practices, optimizing query performance, and utilizing parallel processing techniques.
  • Collaborated with data architects to design and implement data warehousing solutions, ensuring efficient data storage, retrieval, and integration.
  • Designed ETL processes to extract data from various internal and external sources, ensuring data integrity and accuracy during extraction.
  • Developed Ab Initio graphs and scripts to perform the data transformations, aggregations, calculations, and enrichments needed to calculate net income revenue.
  • Applied data quality checks to identify and handle anomalies, inconsistencies, and missing values.
  • Fine-tuned ETL processes for optimal performance, including optimizing query execution, data partitioning, and parallel processing.
  • Identified and eliminated bottlenecks in the ETL workflow to minimize processing time.
  • Integrated data from multiple sources to create a consolidated view of net income revenue, incorporating reconciliation processes where required.
  • Implemented data validation and reconciliation routines to ensure the accuracy and consistency of net income revenue data across all stages of the ETL process.
  • Monitored data quality metrics and addressed data quality issues promptly to maintain the reliability of financial reporting.

TRANSUNION

Chennai, India
11.2016 - 11.2017
OLM is an online process for managing CIBIL entries, allowing TransUnion users to perform operations such as merging records for the same user and updating user information. Requests received from the OLM portal are routed by web services to the corresponding service for fulfillment, and the response is sent back to the portal.
  • Collaborated with business stakeholders to understand the requirements and data sources related to online transaction processing.
  • Designed ETL processes to extract data from various sources, including databases, APIs, and event streams, while ensuring minimal latency.
  • Developed Ab Initio graphs and components to transform, cleanse, and enrich incoming data in real time, conforming to business rules and requirements.
  • Loaded processed data into the target OLTP system, ensuring timely and accurate updates to the database.
  • Optimized loading processes to minimize downtime and impact on system performance.
  • Designed ETL workflows to operate in an event-driven architecture, responding to incoming events and triggers to process transactions promptly.
  • Collaborated with data governance teams to ensure compliance with data quality standards.

SYNCHRONY

Bhubaneswar, India
03.2014 - 08.2016
The Collections and Recovery DW mainly holds data for delinquent accounts, dialer information, and associate information. Its three main subject areas are Current/Delinquent, Dialer, and HR. The DW is 2.5 TB in size, and close to 150 GB of data is processed every month.
  • Designed and developed ETL processes to extract data from sources such as databases, flat files, and APIs, and integrated it into the OLAP system.
  • Developed Ab Initio graphs and components to perform data transformations, aggregations, and calculations required for generating meaningful insights in the OLAP system.
  • Applied dimensional modeling techniques to design efficient data structures (e.g., star schema or snowflake schema) that support analytical querying and reporting.
  • Created appropriate dimensions and fact tables to model collections and recovery data effectively.
  • Optimized ETL processes and cube performance for efficient querying and reporting by implementing indexing, partitioning, and aggregation strategies.
  • Implemented data validation rules and data quality checks to ensure that the OLAP data accurately represents collections and recovery activities.
  • Adhered to change management processes for deploying ETL and OLAP updates to production environments.
  • Conducted thorough testing to ensure the accuracy of data transformations, aggregations, and OLAP cube calculations.
  • Collaborated with business analysts and subject matter experts to gather requirements related to collections and recovery analytics.

Personal Information

Title: Data Engineer

Timeline

Cloud Data Engineer

ITIDATA, an EXL Company
03.2022 - Current

Senior Associate

ITIDATA, an EXL Company
10.2017 - 02.2022

Associate Consultant

Capgemini
11.2016 - 11.2017

Software Engineer

Tech Mahindra
03.2014 - 08.2016

Computer Science and Engineering

Jawaharlal Nehru Technological University