Findo Edathirithikaran

Summary

Skilled Data Engineer with 10+ years of experience in data engineering and cloud computing, possessing in-depth knowledge of data modelling, security, data manipulation techniques and computer programming, paired with excellent communication and project management skills.

Overview

  • 11 years of professional experience
  • 4 certifications

Work History

Data Engineer

Cleanaway
02.2022 - Current
  • Leading the design, development, implementation and QA testing of data pipelines and infrastructure for multiple revenue-generating and revenue-saving projects
  • Collaborating with the team to develop data acquisition strategies and ensuring the accuracy, completeness, and quality of the data being collected
  • Overseeing and developing data models, ETL and BAU processes to enhance the capabilities of the EDW across multiple projects and BAU initiatives
  • Communicating data-driven insights and recommendations to senior leadership and other stakeholders
  • Providing technical leadership and guidance to the team on data-related matters such as data architecture and data engineering best practices
  • Reduced technical debt by refactoring legacy code and implementing modern development methodologies.
  • Provided technical guidance and mentorship to junior team members and new joiners, fostering a collaborative learning environment within the organization.

Data Engineer

Coles
12.2019 - 02.2022

Domain: Retail

  • Built scalable ETL solutions using Python/Spark
  • Performed a POC to extract data from Azure Data Lake and migrate it to Snowflake, and created a self-serviceable, config-driven ETL framework to implement the same across all subject areas
  • Performed a POC to implement Airflow as the orchestrator for the EDW
  • Designed the architecture of ETL applications and led the team in implementing it
  • Performed in-depth analysis of current pipelines to determine key areas that could maximize ROI
  • Implemented and optimized existing ETL jobs for data processing
  • Analyzed complex data and identified anomalies, trends and risks to provide useful insights to improve internal controls
  • Developed, implemented and maintained data analytics protocols, standards and documentation
  • Prepared documentation and analytic reports, delivering summarized results, analysis and conclusions to stakeholders
  • Created CI/CD pipelines using GitHub and Terraform

Technology Analyst ETL

Infosys
06.2013 - 12.2019

Domain: Banking & Superannuation

  • Participated in multiple projects simultaneously and led a team of 5 end to end (requirement gathering to production go-live), performing activities including stakeholder engagement and obtaining acceptance sign-offs
  • Designed and created ETL pipelines using IBM InfoSphere DataStage to extract data from multiple source systems into an EDW running on Oracle Exadata for multiple projects, including AML/FATCA/PBOP (Personal Banking Origination Platform)
  • Led data reconciliation exercises to ensure data consistency between upstream, EDW and downstream systems; identified areas of inconsistency, explained their causes and suggested fixes
  • Created user forms in Oracle APEX using RESTful Web APIs for multiple business users and integrated the application with the EDW ETL framework, enabling API-call-driven pipeline execution for multiple projects, including APRA reporting and campaigning initiatives
  • Collaborated with cross-functional teams to develop innovative technology solutions for business challenges.
  • Created data pipelines to process real-time data and perform calculations based on values entered by users through Oracle APEX forms
  • Created source-to-target mapping and data lineage documents for various projects
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability
  • Employed data cleansing methods, significantly enhancing data quality
  • Created reports and dashboards for business applications in Tableau
  • Prepared user documentation for hands-on training of the end-user community
  • Used Bash/Linux shell scripting to design and update databases
  • Managed the change management process using BMC Remedy and Git

Education

Bachelor of Engineering - Electronics & Telecommunications

Mumbai University
Mumbai, India

Skills

  • Programming: Python, PL/SQL
  • Azure Services: Data Factory, Databricks, Event Hub, Synapse, Data Warehouse, Storage Account
  • Frameworks: Apache Spark, PySpark, Pandas, Matplotlib
  • Databases: Synapse, Snowflake, SQL Server, Oracle Exadata
  • Data Modelling Tool: SQLDBM
  • Data Visualization: Tableau
  • Azure DevOps, GitHub, Jira, CI/CD, Confluence
  • Shell Scripting
  • Agile methodologies: Scrum, Kanban

Certifications

  • Certified ScrumMaster®
  • Azure Fundamentals
  • Tableau Desktop Specialist
  • PCAP – Certified Associate in Python Programming
