Abhishek Kumar

Melbourne

Summary

Associate Manager with 16+ years of experience building and operating cutting-edge technology solutions and managing the implementation and support of systems critical to Information Technology and Data Warehousing. A creative thinker, adept at software development, who has provided data services to banks across Australia, the UK, and Asia, with a strong appreciation of the data domain.

Overview

17 years of professional experience
5 certifications

Work History

Associate Manager

Capgemini Australia Pty Ltd
08.2022 - Current

Consultant Data Engineer for NAB:

Data Delivery Conformance:

  • Worked on the migration of ADA silver tables to ADA DLT tables using the Databricks DLT framework; configured, maintained, and wrote YAML and JSON files defining bronze source tables, join conditions, and Change Data Capture (CDC) logic, and implemented SQL transformations to load data from staging into DLT tables.
  • Analyzed silver table data models, mapped bronze table fields to silver table fields, obtained mapping approvals from design forums, and wrote complex SQL queries to ingest data from bronze tables into silver conformance tables.

Data Delivery Ingestion:

  • Ingested file-based sources from S3 buckets into Databricks bronze tables by configuring metadata JSON files; configured the HVR tool to ingest data from Oracle and PostgreSQL source systems into Databricks staging and bronze tables. Collaborated with the platform team to install HVR agents on source systems and set up HashiCorp Vault configuration on the client server.

NDH Platform:

  • Gathered business requirements to build a DataMart in AWS Redshift for a Single Customer View, developed table DDLs using Erwin Data Modeler, assisted with DML scripting, and implemented Airflow DAGs for the daily and historical event-listener and staging-processor workflows.

Other Tasks:

  • Developing or acquiring test data and running tests to verify that code logic produces the desired results; debugging and revising programs based on test results.
  • Debugging program code in SIT and Production environments and delivering error fixes within the required SLA after analysing each issue.
  • Deploying configs and code to SIT and Production environments through continuous integration and continuous deployment (CI/CD) pipelines built on Git and Jenkins.
  • Preparing solution design and support-handover documents, creating change requests, and providing functional direction to the support team for production deployments.

Technical Lead

Mindtree Ltd
04.2022 - 07.2022

Platform Architect for ANZ Bank:

Risk Data Hub Platform:

  • Gathered customer requirements for implementing the Google Cloud Composer 2 orchestration service on the Risk Data Hub platform; analysed the platform's needs and prepared the design document and technical specifications in consultation with the Technology Area Lead and Solution Architect.
  • Developed the subscription Airflow DAG in Python, which polls the topic for Common Data Hub messages, filters those relevant to the Risk Data Hub, and triggers the next DAG for data processing; configured the CI/CD pipeline to deploy Airflow DAGs to the Cloud Storage bucket associated with Composer 2.
  • Analysed and debugged issues occurring in the production environment within service-level agreements and helped fix the underlying problems.
  • Wrote Python test cases for the subscription DAG, performed independent testing in the development environment, and coordinated with the Common Data Hub team for system integration testing.
  • Coded against the approved design document, following best practices, and completed development assignments so that the programs supported the approved project requirements.
  • Assisted in developing and implementing applications and systems.
  • Evaluated and tested new or modified software programs and development procedures, ensuring programs adhered to set standards and operated according to user needs.

Technology Lead

Infosys Limited
09.2008 - 04.2022

Analyst Engineer for NAB:

Data Delivery - Customer Analytics:

  • Designed and implemented the Data Management as a Service (DMAAS) framework to migrate marketing data into Teradata, and from Teradata to an S3 bucket consumed by downstream systems such as Adobe Campaign Management, through an automated pipeline.
  • Developed Unix shell scripts to extract and load data from txt and csv files on a NAS drive, or from Oracle systems, into Teradata tables using BTEQ and MLOAD; developed a Python script that generates Teradata extracts as csv files.
  • Used Jenkins to build and deploy the DMAAS packages automatically across multiple environments.
  • Configured Control-M jobs to schedule the DMAAS processes that load Teradata table extracts into the S3 bucket.
  • Worked on CARDS mainframe systems, modifying their SQLLIB and JCLs, as well as their Teradata tables, views, implementation scripts, and deployment YAML files, to load the data.

Platform Architect for HSBC UK:

Financial Crime Risk Transaction Monitoring Dashboard and Juniper Platform:

  • Gathered business requirements from the client through meetings with business users, identified gaps and issues in existing systems and applications, and prepared design documents for Project Management Review; prepared proposal documents showing how to deliver the business solution cost-effectively.
  • Wrote pseudocode based on research and analysis of Google Cloud, and developed file- and Teradata-based ingestion solutions for the Juniper application using GCP and Spring Boot microservices.
  • Designed the user interface and wrote and maintained backend ETL script logic for the Transaction Monitoring Reconciliation Dashboard in QlikView, in line with technical specifications and quality standards.
  • Guided the team in developing high-quality code deliverables, performed code reviews, wrote and executed test cases in non-prod environments, and prepared user manuals, operational procedures, and testing documents for different projects.
  • Reviewed and documented requirements for application support, design, and implementation; created and reviewed system test plans; and performed application sanity and functional testing to ensure readiness across environments. Also provided production support to analyse root causes, debug, and fix defects.
  • Prepared implementation plans, raised change requests, obtained approvals, and deployed code and applications to non-prod and prod environments through CI/CD technologies such as Git, Ansible, Jenkins, and Cloud Build.
  • Other Projects: ICICI Bank (Finacle CRM Batch Upload Framework and re-architecture), Adidas (transactional email setup), ANZ (Customer Data Ecosystem)

Education

Bachelor of Engineering - Electronics and Electrical Communication

Punjab Engineering College (Deemed University)
01.2008

Skills

  • Cloud Platforms: Databricks, Google Cloud Platform, Azure, AWS
  • CI/CD and Orchestration: Jenkins, GitHub, Terraform, Apache Airflow, Control-M, Artifactory
  • Programming Languages and Frameworks: Python, Core Java, Spring Boot, microservices
  • Databases: MySQL, Oracle, Teradata
  • Replication Tools: HVR
  • BI Tools: QlikView, Qlik Sense
  • Big Data: Hadoop, HDFS, Hive, Sqoop, Kafka

Certification

  • Databricks Certified Data Engineer Associate
  • Google Cloud Certified Professional Data Engineer
  • Microsoft Certified: Azure Data Engineer Associate
  • Microsoft Certified: Azure Fundamentals
  • Oracle Certified Java Programmer 1.6

Additional Information

References are available upon request
