Saisha Hardikar

Melbourne, VIC

Summary

Data Engineer with 8+ years of experience across multiple consulting projects involving big data and cloud technologies. Currently working as a Data Engineer with Cevo, Australia, leading a small team of four that delivers valued outcomes while keeping quality and timelines intact. Sound experience with agile delivery, including handling the scrum master role for the team in addition to data engineering responsibilities.

Work History

Data Engineer - Team Lead

Cevo - David Jones
Melbourne, VIC
06.2024 - Current
  • Collaborated with cross-functional teams to gather requirements and translate business needs into technical specifications for data solutions.
  • Extended the existing platform built for David Jones during an earlier project, establishing data pipelines for new upstream systems to send data to David Jones.
  • Created views on top of ingested data to enhance data visualisation, so that data analysts can combine data from multiple sources and build analytical models on top of them.
  • Created and implemented complex business intelligence solutions.
  • Collaborated with other departments to develop effective solutions that meet customer needs.
  • Reviewed completed work to verify consistency, quality, and conformance.
  • Documented data architecture designs and changes, ensuring knowledge transfer and system maintainability.
  • Managed version control and deployment of data applications using Git, Docker, and Jenkins.
  • Trained new staff in relevant processes and procedures.
  • Technologies involved: AWS (S3, CloudFormation, CloudWatch, SQS, SNS, Glue, Athena, DMS, IAM, ECS), Python, Golang, Docker
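
The cross-source views above can be sketched in code; the table and column names here are illustrative, not the client schema:

```python
def build_union_view(view_name, source_tables, columns):
    """Build a CREATE OR REPLACE VIEW statement that unions the same
    columns from several source tables, tagging each row with its origin."""
    col_list = ", ".join(columns)
    selects = [
        f"SELECT {col_list}, '{table}' AS source_system FROM {table}"
        for table in source_tables
    ]
    return f"CREATE OR REPLACE VIEW {view_name} AS\n" + "\nUNION ALL\n".join(selects)

# Hypothetical table and column names, not the client schema.
sql = build_union_view(
    "analytics.orders_all",
    ["raw.orders_pos", "raw.orders_online"],
    ["order_id", "order_ts", "amount"],
)
```

The generated statement can then be run through Athena, where the view gives analysts a single queryable object over both sources.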

Data Engineer

Country Road Group
Melbourne, VIC
11.2023 - 05.2024
  • Worked on separating Country Road Group from David Jones after the de-merger of the two companies, recreating CRG's cloud infrastructure from scratch and splitting out all data and processes related to David Jones.
  • Ensured that all AWS services are defined as code, formalising the process and making it easier to debug and retrieve information.
  • As a result, substantially improved the infrastructure by uncovering many broken links and converting them into robust, fool-proof data pipelines written in languages such as Python and Golang.
  • Participated in agile development processes, contributing to sprint planning, stand-ups, and reviews to ensure timely delivery of data projects.
  • Conducted rigorous testing and validation of data pipelines to ensure accuracy and completeness of data.
  • Managed version control and deployment of data applications using Git, Docker, and Jenkins.
  • Automated data quality checks and error handling processes to ensure the integrity and reliability of datasets.
  • Documented data architecture designs and changes, ensuring knowledge transfer and system maintainability.
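
The automated quality checks mentioned above can be sketched as a small pre-load validation step; the field names are assumptions, not the client schema:

```python
from collections import Counter

def run_quality_checks(rows, required_fields, key_field):
    """Validate a batch of records before loading: count rows with missing
    required fields and primary keys that appear more than once."""
    missing = [r for r in rows
               if any(r.get(f) in (None, "") for f in required_fields)]
    key_counts = Counter(r.get(key_field) for r in rows)
    duplicates = [k for k, n in key_counts.items() if n > 1]
    return {"missing_required": len(missing), "duplicate_keys": len(duplicates)}

# Illustrative records; field names are placeholders.
batch = [
    {"order_id": "1", "amount": 10.0},
    {"order_id": "1", "amount": 12.5},
    {"order_id": "2", "amount": None},
]
report = run_quality_checks(batch, ["order_id", "amount"], "order_id")
# report == {"missing_required": 1, "duplicate_keys": 1}
```

A non-empty failure count would typically halt the load or route the batch to an error-handling path rather than silently loading bad records.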

Data Engineer

Cevo, Australia - David Jones/Country Road Group (client) - Retail domain
Melbourne, Australia
07.2022 - 11.2023
  • Worked as a data engineer on a platform and services project, managing the end-to-end delivery of the AWS cloud platform for the client
  • Onboarded various types of data sources from legacy platforms into the cloud (AWS)
  • Designed solutions based on discussions with the client, solution architect, and other stakeholders
  • Wrote and maintained program code in Python and Golang to meet system requirements
  • Developed and maintained several Web APIs handling customer interaction with DJ and CRG in-store and online (through the website or mobile apps)
  • Improved the systems in response to client requests
  • Managed production deployments of the code by coordinating between multiple platform teams and raising change requests through ServiceNow.
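
Message handling in such pipelines can be sketched as below; the event follows the standard SQS record shape, and the payload field names are illustrative:

```python
import json

def handle_sqs_event(event):
    """Extract JSON payloads from an SQS event batch, collecting the IDs of
    records whose body is not valid JSON (candidates for a dead-letter queue)."""
    payloads, failures = [], []
    for record in event.get("Records", []):
        try:
            payloads.append(json.loads(record["body"]))
        except (KeyError, json.JSONDecodeError):
            failures.append(record.get("messageId"))
    return payloads, failures

# Illustrative event in the standard SQS record shape.
event = {"Records": [
    {"messageId": "a1", "body": '{"order_id": "1001"}'},
    {"messageId": "b2", "body": "not-json"},
]}
payloads, failures = handle_sqs_event(event)
```

Keeping parse failures separate lets the consumer acknowledge good messages while the bad ones are retried or dead-lettered by the queue.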

Data Engineer

Capgemini, Australia - NAB (client) - Financial services domain
Melbourne, Australia
03.2019 - 07.2022
  • Worked as a data engineer on a data ingestion project, managing the end-to-end delivery of a number of data ingestion systems
  • Ingested various types of data from legacy platforms into the cloud (AWS)
  • Designed solutions based on discussions with the client, solution architect, and other stakeholders
  • Wrote and maintained program code to meet system requirements, system designs, and technical specifications in accordance with quality-accredited standards
  • Reviewed code to ensure it matched coding standards and secure practices
  • Managed production deployments of the code by coordinating between multiple platform teams and raising change requests through ServiceNow.
  • Technologies involved: AWS (S3, CloudFormation, CloudWatch, SQS, SNS, Glue, Athena, DMS, IAM, ECS, AppFlow, Route 53, File Gateway), Python, Golang, Docker
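
A typical landing-zone layout for this kind of ingestion might look like the following sketch; the prefix scheme and names are assumptions, not the client's actual layout:

```python
from datetime import datetime, timezone

def landing_key(source_system, table, ingested_at=None):
    """Build a date-partitioned S3 key for a raw landing zone, so that
    Glue/Athena can prune partitions by load date."""
    ts = ingested_at or datetime.now(timezone.utc)
    return (f"raw/{source_system}/{table}/"
            f"year={ts:%Y}/month={ts:%m}/day={ts:%d}/"
            f"{table}_{ts:%Y%m%d%H%M%S}.parquet")

# Source and table names are placeholders.
key = landing_key("core_banking", "accounts",
                  datetime(2021, 3, 5, 12, 0, tzinfo=timezone.utc))
# key == "raw/core_banking/accounts/year=2021/month=03/day=05/accounts_20210305120000.parquet"
```

The `year=/month=/day=` style keeps the partitions Hive-compatible, which is what lets Glue crawlers register them without extra configuration.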

Hadoop Developer

Capgemini, India - Johnson & Johnson - Healthcare domain
Pune, India
  • Worked as a Hadoop developer for one of the bronze applications for J&J
  • The project aimed to continuously monitor that all designed processes were working correctly, and to report and resolve any issues so that the system remained defect-free and always reported accurate information to the end user
  • Checked the status of the coordinators in the Oozie sequence on a daily basis
  • Verified that the latest partitions were loaded into the Hive tables in the metastore manager, which end users ultimately consume
  • Published a daily report notifying the BUIT team of job statuses.
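
The daily status report could be rolled up by a small helper like this sketch; the job names and statuses are illustrative:

```python
def summarise_job_statuses(jobs):
    """Roll coordinator action statuses into a one-line daily summary,
    listing any jobs that need attention (anything not succeeded or running)."""
    failed = sorted(name for name, status in jobs.items()
                    if status not in ("SUCCEEDED", "RUNNING"))
    summary = f"{len(jobs) - len(failed)}/{len(jobs)} jobs healthy"
    return summary, failed

# Illustrative coordinator statuses.
statuses = {"ingest_daily": "SUCCEEDED", "load_hive": "KILLED", "publish": "RUNNING"}
summary, failed = summarise_job_statuses(statuses)
# summary == "2/3 jobs healthy", failed == ["load_hive"]
```

The summary line and the failure list are exactly the two pieces of information a daily status email needs.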

Hadoop Developer

Capgemini, India - HSBC - Financial services domain
Pune, India
  • Worked as a Hadoop developer for a data ingestion project
  • Responsible for coordinating with stakeholders, gathering business requirements, and documenting them textually and within models
  • Data was collected from RDBMS and pushed into Hadoop using Sqoop
  • Developed DDL files and built Hive tables
  • Used the Hive data warehouse tool to analyse data in HDFS and developed Hive queries
  • Set up Control-M to automate the process
  • Responsible for reviewing SQL queries submitted by the whole team for performance tuning.
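
The Sqoop-based ingestion above can be illustrated with a small command builder; the JDBC URL, table, and target directory are placeholders:

```python
import shlex

def sqoop_import_cmd(jdbc_url, table, target_dir, mappers=4):
    """Assemble a Sqoop import command for pulling an RDBMS table into HDFS,
    quoting each argument so the string is safe to run in a shell."""
    args = [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(mappers),
        "--as-parquetfile",
    ]
    return " ".join(shlex.quote(a) for a in args)

# Connection details are placeholders, not a real database.
cmd = sqoop_import_cmd("jdbc:oracle:thin:@db-host:1521/ORCL",
                       "CUSTOMERS", "/data/raw/customers")
```

A scheduler such as Control-M would then run the assembled command on the configured cadence.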

Education

Bachelor of Engineering (Computers)

Savitribai Phule Pune University
India
07.2015

Skills

  • Consulting
  • AWS
  • Big data technologies
  • SQL
  • Python
  • Golang
  • Data Migration

Certification

  • AWS Solutions Architect Associate
  • Microsoft Certified: Azure Data Fundamentals
  • Oracle Database SQL Expert

Accomplishments

  • Team-player
  • Always putting my hand up for bigger challenges
  • Quick learner
  • Always willing to learn new things and share the knowledge gained with colleagues to improve together

Personal Information

Title: Senior Data Engineer

Languages

  • English Proficient
  • Hindi Proficient
  • Marathi Native
