
ARUNA BATHALA

Westmead, Australia

Summary

A highly capable Data Engineer with over 10 years' experience, including roles as an Azure Databricks engineer and ETL DevOps lead. Demonstrated leadership in managing multiple onsite and offshore project assignments, and capable of working under pressure and to tight timeframes while delivering all requirements on time. An organized and dependable candidate, successful at managing multiple priorities and challenges with a positive attitude, and willing to take on added responsibilities to meet team goals. Committed to delivering exceptional results through strategic planning, meticulous execution and comprehensive stakeholder engagement.

Overview

10+ years of professional experience

Work History

Azure Databricks Lead

Optus
05.2023 - Current

Company Overview: The Optus ODM program is tasked with decommissioning ODS (Operational Data Store), an Oracle system that receives data from Jarvis/BCC/Amdocs. Jarvis is Optus's provisioning and billing system; the data it generates, together with some usage data, forms the input to ODS. ODS has its own data pipelines and has downstream interfaces (IMLs, Interface Master List) and reports (RMLs, Report Master List) consuming this data.


  • Design and implement database solutions in Azure SQL Data Warehouse and Azure SQL
  • Extract, transform and load data from source systems to Azure data storage services using a combination of Azure Data Factory and Azure SQL
  • Ingest data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and process it in Azure Databricks
  • Use Airflow to run batch jobs in Databricks (see the orchestration sketch after this list)
  • Evaluated emerging Microsoft Azure services to identify areas for future growth or improvement within the organization's IT landscape
  • Streamlined deployment processes through Azure DevOps pipelines, reducing time to production and increasing efficiency
  • Engage with business users to gather requirements and design visualizations in reporting tools
  • Maintained up-to-date documentation of system configurations, network topology diagrams and change management records to support transparency
  • Collaborate with application architects and DevOps
  • Identify and implement best practices, tools and standards
  • Build complex distributed systems involving large-scale data handling, metrics collection, data pipelines and analytics
  • Proved successful working within tight deadlines and a fast-paced environment
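
A minimal sketch of the Airflow-to-Databricks batch orchestration described above, assuming Airflow 2.4+ with the apache-airflow-providers-databricks package installed and a configured "databricks_default" connection; the DAG name, cluster spec and notebook path are hypothetical placeholders, not actual project configuration:

```python
# Hypothetical daily batch DAG submitting a Databricks notebook run.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="databricks_daily_batch",   # placeholder DAG name
    start_date=datetime(2023, 5, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a one-time run of an ingestion notebook on a new job cluster.
    run_batch = DatabricksSubmitRunOperator(
        task_id="run_ingestion_notebook",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/pipelines/ingest_to_lake"},
    )
```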

DATASTAGE ETL DevOps Lead

NEWSCORP
07.2021 - 05.2023

Company Overview: NEWSFACT PROJECT. This project dealt with NEWSCORP finance and ad-sales data, consuming data from multiple sources. The work involved understanding the business requirements, preparing the data model, and creating design documents and jobs in the BAL layer. The process runs weekly, snapshotting transaction data from sources and providing reports to end users.

  • Developed DataStage jobs to load collections data from multiple sources (Aspect, CACS, Strata and FDR) into the respective dimension and fact tables, with the required business transformations
  • Handled finance month-end and year-end runs end to end
  • Involved in all phases of the SDLC; created detailed analysis and design documents with source-to-target mappings
  • Developed and maintained accurate project documentation and data model diagrams to give management a proper understanding of organizational needs
  • Prepared technical data flow proposals for enhancements and integration of existing third-party data; communicated with business users and project management to gather business requirements and translate them into ETL/ELT specifications
  • Provided technical support to both the business team and user departments for all projects
  • Managed multiple offshore and nearshore resources during the project and delivered all requirements on time
  • As part of raw data archival, developed a framework to load all raw files into the foundation layer; coding was done in Unix shell scripting, all data was loaded into the foundation layer with BigSQL-compatible Hive tables built on top, and the Hadoop data is read through Hive query language for discovery purposes (see the sketch after this list)
  • Worked on migrating DataStage jobs from InfoSphere Information Server 9.1 to InfoSphere Information Server 11.7; primary tasks were gathering migration requirements, designing the topology/playbook, testing jobs on the 11.7 QA server, and production implementation/support
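
An illustrative sketch of the raw-archival pattern described above: raw files landed in a foundation-layer directory are exposed through an external Hive table so they can be queried with Hive query language and BigSQL-compatible tooling. The database, table, columns and path are hypothetical, and this PySpark DDL stands in for the actual shell-script framework:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("raw_archival_foundation")
    .enableHiveSupport()   # needed so the DDL below registers in the Hive metastore
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS foundation")

# External table over the raw landing directory: dropping the table
# never deletes the archived files themselves.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS foundation.raw_collections (
        record_ts  STRING,
        source_sys STRING,
        payload    STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/foundation/raw_collections'
""")

# Discovery users can now query the archived files through HiveQL.
spark.sql(
    "SELECT source_sys, COUNT(*) AS rows FROM foundation.raw_collections GROUP BY source_sys"
).show()
```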

DATASTAGE ETL DEVELOPER

ALLY FINANCIALS
04.2018 - 07.2021

Company Overview: Ally Bank offers savings products, including certificates of deposit, online savings accounts, interest checking accounts and money market accounts. Ally Bank is a member of the Federal Deposit Insurance Corporation.

  • Developed DataStage jobs to load collections data from multiple sources (Aspect, CACS, Strata and FDR) into the respective dimension and fact tables, with the required business transformations (see the dimension-load sketch after this list)
  • Involved in all phases of the SDLC; created detailed analysis and design documents with source-to-target mappings
  • Developed and maintained accurate project documentation and data model diagrams to give management a proper understanding of organizational needs
  • Prepared technical data flow proposals for enhancements and integration of existing third-party data
  • Communicated with business users and project management to gather business requirements and translate them into ETL/ELT specifications
  • Provided technical support to both the business team and user departments for all projects
  • Managed multiple offshore and nearshore resources during the project and delivered all requirements on time
  • As part of raw data archival, worked on a project developing a framework to load all raw files into the foundation layer
  • Coding was done in Unix shell scripting
  • All data was loaded into the foundation layer with BigSQL-compatible Hive tables built on top
  • The Hadoop data is read through Hive query language and is used for discovery purposes
  • Worked on migrating DataStage jobs from InfoSphere Information Server 9.1 to InfoSphere Information Server 11.7
  • Primary tasks were gathering migration requirements, designing the topology/playbook, testing jobs on the 11.7 QA server, and production implementation/support
  • ALLY FINANCIALS - MARS ADVANTAGE
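
DataStage jobs are built in the DataStage Designer rather than in code, so as a hedged stand-in, this PySpark sketch shows the dimension-load pattern the bullets above describe: conform a source feed, deduplicate on the business key and assign surrogate keys before writing the dimension table. All paths, table and column names are hypothetical, not Ally's actual schema:

```python
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("collections_dim_load").getOrCreate()

# One of the source feeds (Aspect/CACS/Strata/FDR), already landed as Parquet.
src = spark.read.parquet("/staging/collections/aspect")

dim_account = (
    src.select(
        F.col("acct_no").alias("account_bk"),                   # business key
        F.upper(F.trim(F.col("acct_status"))).alias("status"),  # sample transformation
        F.to_date(F.col("open_dt"), "yyyyMMdd").alias("open_date"),
    )
    .dropDuplicates(["account_bk"])
    # Assign a surrogate key before loading the dimension table.
    .withColumn("account_sk", F.row_number().over(Window.orderBy("account_bk")))
)

spark.sql("CREATE DATABASE IF NOT EXISTS warehouse")
dim_account.write.mode("overwrite").saveAsTable("warehouse.dim_account")
```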

DATASTAGE DEVELOPER

UNITED SERVICES AUTOMOBILE ASSOCIATION
09.2014 - 04.2018

Company Overview: CREDIT CARD CONVERSION PROJECT. A large project to integrate single-entity credit cards into dual-entity credit cards. This migration enabled a cross-border portfolio view for all credit card applications and made it possible to maintain the same plastic number across history for analytics usage.

  • Developed DataStage jobs applying the required ETL transformations and loading the respective dimension and fact tables
  • Involved in implementation, coding and integration
  • Developed a bridge table holding all sensitive data for the Credit Card line of business (see the illustrative query after this list)
  • Worked in Control-M to schedule and monitor ETL jobs
  • The primary objective of the projects was to maintain the Extract, Transform, Load (ETL) portfolio of projects at the enterprise level
  • In a Level-3 production support role, provided quick problem resolution for daily, weekly and monthly processing cycles executing in DataStage
  • In parallel with development work, established an exemplary record of providing successful system support and delivering business value for mid-level to large business intelligence applications
  • Also worked on service requests, developing ETL DataStage jobs for small business requirements
  • DATA MANAGEMENT RETURN TO SERVICE-PRODUCTION SUPPORT
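
An illustrative query showing how a bridge table like the one described above is typically consumed: the fact table carries only surrogate keys, while the sensitive credit card identifiers live in the bridge under restricted access. All table and column names are hypothetical, and tiny in-memory stand-ins are built so the sketch runs on its own:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cc_bridge_demo").getOrCreate()

# Hypothetical stand-ins for the fact and bridge tables.
spark.createDataFrame(
    [(1, 101, 250.00, "2017-03-01"), (2, 102, 80.50, "2017-03-02")],
    ["txn_id", "account_sk", "txn_amount", "txn_date"],
).createOrReplaceTempView("fact_cc_txn")

spark.createDataFrame(
    [(101, "4111-XXXX-XXXX-0001"), (102, "4111-XXXX-XXXX-0002")],
    ["account_sk", "plastic_number"],
).createOrReplaceTempView("bridge_cc_pii")

# The sensitive plastic number is resolved only through the bridge join.
spark.sql("""
    SELECT f.txn_id, f.txn_amount, b.plastic_number
    FROM   fact_cc_txn   f
    JOIN   bridge_cc_pii b ON f.account_sk = b.account_sk
    WHERE  f.txn_date >= '2017-01-01'
""").show()
```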

Education

Bachelor of Engineering - Computer Science

Velammal Institute of Technology
Chennai, Tamil Nadu
05.2014

Skills

  • Expertise in Azure Databricks, Azure SQL Database and Azure DevOps pipelines
  • Expertise in IBM DataStage (versions 8.5, 9.1, 11.5 and 11.7) for ETL and ELT operations on data
  • Proficiency in Unix shell scripting and PySpark
  • Proficiency in writing and debugging complex SQL
  • Hands-on experience with Hadoop (Hive), IBM DB2, Netezza, SQL Server and Oracle
  • Data archival into IBM BigInsights with Hive tables
  • Experience with Airflow and the Control-M scheduling tool

CORE COMPETENCIES

  • Experience migrating SQL databases to Azure Data Lake, Azure SQL Database, Databricks and Azure SQL Data Warehouse, controlling and granting database access, and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory (see the copy-pattern sketch after this list).
  • Experienced with cloud architecture, automation and troubleshooting in Azure environments. Utilizes deep knowledge of Azure services to enhance system performance and reliability. Strong understanding of infrastructure as code and continuous integration/continuous deployment (CI/CD) pipelines.
  • Design and implement database solutions using PySpark, Azure SQL Data Warehouse and Azure SQL.
  • Experienced ETL operations lead, ETL DataStage developer, Hadoop (Hive) engineer and production support analyst, with subject matter expertise in mainframe sources with COBOL/ASCII file structures, distributed (RDBMS) sources, and business intelligence data in DB2, Netezza, Oracle, SQL Server and Hadoop Hive data warehouses. Certified and skilled in Azure fundamentals, data warehouse concepts and Databricks.
  • Worked on developing and supporting major banking projects, including the credit card conversion and member debt solutions projects for United Services Automobile Association, and across multiple domains such as retail and media.
  • Demonstrated ability to analyze operations and identify areas of improvement, supporting leaders and guiding teams through change to facilitate accurate and detailed work scopes and project plans.
  • Strong understanding of IT domains, financial products, risk management and the impact of technology/business change.
  • Ability to analyze large datasets, identify trends and extract meaningful insights for business reporting purposes.
  • Talent for using effective communication, interpersonal skills, tact and diplomacy to manage, influence, negotiate and foster long-term strategic relationships with internal and external stakeholders.
  • Strong leadership, capable of managing key strategic initiatives within cross-functional teams, providing guidance and mentorship and fostering a culture of continuous improvement.
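
The on-premises-to-Azure Data Lake migration named in the first competency runs through Azure Data Factory copy activities, which are authored as JSON pipelines rather than code; the PySpark below is only an illustrative stand-in for the same copy pattern, with placeholder connection details, credentials and paths:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("onprem_to_adls_copy").getOrCreate()

# Read a table from the on-premises SQL Server over JDBC.
src = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "***")  # in practice, fetch from Key Vault / a secret scope
    .load()
)

# Land it in Azure Data Lake Storage Gen2 as Delta for downstream Databricks use.
(
    src.write.format("delta")
    .mode("overwrite")
    .save("abfss://raw@examplelake.dfs.core.windows.net/sales/orders")
)
```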

Awards

DATA DESIGNER COMMITTED AWARD – NEWSCORP [MAY 2024]

PAT ON THE BACK AWARD – TECH MAHINDRA [MARCH 2024]

BEST PERFORMANCE FOR PROFESSIONALISM AWARD – TECH MAHINDRA [OCTOBER 2023]

BRAVO AWARD – TECH MAHINDRA [MARCH 2023]

AQT NINJA AWARD – TECH MAHINDRA [MARCH 2023]

DISTINGUISHED ACHIEVER AWARD – TECH MAHINDRA [DEC 2022]

BRAVO AWARD – TECH MAHINDRA [SEPTEMBER 2021]

BEST EMPLOYEE AWARD – HCL TECHNOLOGIES [JAN 2018]
