
Thiago Oliveira

MacKenzie, QLD

Summary

I am a passionate and driven data professional dedicated to crafting innovative solutions that bridge the gap between technical complexity and business strategy. With deep expertise in data architecture, modelling, and engineering, I thrive on solving challenging problems to enable organisations to unlock the full potential of their data. Driven by a commitment to technical excellence, I combine hands-on expertise with mentoring and empowering others to create meaningful impact. My vision is to grow as a leader who builds systems that simplify complexity, bridging the divide between data and decision-making to deliver clear and actionable value.

Senior engineering professional with deep expertise in data architecture, pipeline development, and big data technologies. Proven track record in optimising data workflows, enhancing system efficiency, and driving business intelligence initiatives. Strong collaborator, adaptable to evolving project demands, with a focus on delivering impactful results through teamwork and innovation. Skilled in SQL, Python, Spark, and cloud platforms, with a strategic approach to data management and problem-solving.

Overview

7 years of professional experience
1 Certification

Work History

Senior Data Engineer

Solution BI
07.2024 - Current
  • The project aimed to model and design a scalable data architecture using Kimball methodology and Microsoft Fabric
  • Designed and implemented data models following Kimball methodology to support reporting and analytics
  • Delivered an end-to-end data architecture proposal designed for Microsoft Fabric, including cost analysis and compute estimation
  • Provided mentorship and technical guidance to junior team members on data modelling principles
  • Collaborated with C-level stakeholders to translate business needs into a technical solution proposal
  • Developed documentation outlining the architecture and design of data solutions for stakeholder approval
  • Developed detailed cost and resource estimation for a large-scale data project leveraging Microsoft Fabric
  • Facilitated stakeholder workshops to gather and prioritise project requirements, translating business objectives into actionable technical specifications for cost estimation models
  • Provided technical advisory on Microsoft Fabric features, including data pipelines, governance, and integration capabilities, to streamline workflows and reduce manual processes
  • Created technical and solution documentation, including cost projection and migration roadmaps, to support the proposal and implementation plan
  • Migrated legacy SQL-based data processes (Postgres) to a modern cloud architecture, using dbt (data build tool) and Snowflake
  • Led the end-to-end migration of SQL scripts into dbt models for Snowflake
  • Designed and implemented over 200 data transformations across six tenants, achieving full automation, seamless integration, and enhanced system performance
  • Enhanced pipeline scalability and reduced processing time by 30%, significantly improving reporting speed for business users
  • Collaborated with cross-functional teams to ensure alignment with business objectives and technical feasibility
  • Documented migration workflows and provided training to stakeholders for operational readiness

Data Platform Engineer

Suncorp Bank
02.2024 - 07.2024
  • Focused on separating data ownership between the Bank and Insurance divisions, ensuring a smooth transition of processes, systems, and governance
  • Led multiple workshops with stakeholders across sub-divisions, including Data Warehousing, Platform Engineering, and Data Governance, to capture existing processes and knowledge
  • Produced comprehensive documentation of processes, systems, and ownership responsibilities to enable smooth transitions to new teams
  • Planned and proposed process improvements, including automation opportunities, leveraging technologies like Terraform, Ansible, and AWS Redshift
  • Collaborated with stakeholders to ensure smooth transition aligned with organisational goals, maintaining compliance with data governance standards
  • Played key role in understanding, maintaining, and troubleshooting systems, including IBM InfoSphere and Active Directory for access management
  • Provided technical recommendations for process optimisation and platform automation as part of Agile project delivery
  • Developed high-quality documentation, ensuring clarity and usability for transitioning teams
  • Supported development of security and governance frameworks to ensure safe and efficient data handling practices

Senior Data Engineer

BHP
01.2023 - 11.2023
  • The project aimed to transition a machine learning proof of concept into a production-ready data product for stakeholders at the mining site
  • Collaborated with a multidisciplinary team, including data scientists, developers (.NET and Python), and product managers, to productionise and deploy a complex data product (backend, machine learning models, and frontend)
  • Designed and implemented ELT (Extract, Load, and Transform) pipelines for the data product using Python, SQL, AWS Lambda, AWS ECS, and AWS RDS
  • Ensured production processes ran twice daily with reliability and stability, achieving project goals and enabling the product’s beta release
  • Utilised AWS infrastructure services (IAM, VPC, EC2) to support deployment and scalability
  • Liaised with stakeholders to improve data ingestion patterns and ensure process stability, integrating data from multiple SQL Server instances and internal views
  • Focused on automating data ingestion and transformation processes to enable near real-time data availability for enterprise analytics and reporting
  • Automated data engineering processes using AWS DMS, Terraform, AWS Lambda, and Snowflake (including Snowpipe for automated data loading)
  • Developed and optimised extraction and ingestion pipelines, delivering near real-time data availability in Snowflake for analytics and for the data product I owned
  • Collaborated with the data modelling team to model and implement Kimball dimensional modelling using dbt
  • Ensured data quality and performance across ingestion pipelines, aligning with best practices in data governance and security

Cloud Data Engineer

Genie Solutions
09.2019 - 12.2022
  • The initiative involved building Genie Solutions’ first data lake and Lakehouse platform (Databricks on AWS) to enable analytics and reporting across the organisation
  • Co-founded the data team and led the evaluation and selection of the Databricks platform to build the data lake and Lakehouse from the ground up on AWS
  • Developed scalable ELT pipelines for ingesting structured, semi-structured, and unstructured data (including APIs, on-prem proprietary databases, and JSON files) into the Databricks Lakehouse
  • Set standards and co-authored data ingestion and transformation pipelines using PySpark and SQL, collaborating with a Data Scientist to deliver efficient data processes
  • Collaborated on dashboard development with the Data Scientist, gaining expertise in data visualisation and enabling data-driven decision-making
  • Co-led workshops with multidisciplinary stakeholders (marketing, product, finance, engineering) to gather requirements and onboard teams onto the platform based on KPIs and strategic priorities
  • The project focused on automating and optimising the core data conversion product, a significant revenue source for the business
  • Managed automation workflows and existing data pipelines for daily data conversions using Terraform, AWS EC2, and related AWS services
  • Acted as the SME for the automation process, optimising compute resources and leading cost-reduction initiatives
  • Provided workshops and technical training to internal teams (medical installation trainers) to improve product usage and operational cost savings
  • Supported product development and cloud infrastructure enhancements, collaborating with cloud engineers and developers, to ensure scalability and reliability
  • The project involved transitioning the legacy data conversion processes to a modern, serverless architecture
  • Provided expert guidance and mentorship to junior data engineers on using Spark, AWS Glue, and Step Functions for the modernisation process
  • Collaborated with external consultants to migrate the solution from legacy EC2 instances to serverless Python-based processes, reducing costs and drastically improving scalability
  • Coordinated stakeholder interactions to ensure alignment on migration objectives and technical requirements

Data Engineering Consultant

CMD Solutions
02.2019 - 08.2019
  • This project focused on automating development environments for data scientists and facilitating the ingestion and availability of data from industrial systems
  • Automated the ingestion of real-time data from an Oil & Gas PI System into AWS environments using AWS Kinesis and other AWS services, enabling seamless data access for data scientists
  • Developed Python modules within Jupyter Notebooks to preprocess and transform ingested data for machine learning workflows
  • Automated deployment workflows for Python modules, ensuring scalable and efficient execution of machine learning models
  • The project involved transitioning data processes from Qlik Sense to AWS Glue jobs written in PySpark for enhanced scalability and performance
  • Collaborated with the project lead to migrate existing data workflows into AWS Glue processes written in PySpark and Step Functions, improving data processing efficiency
  • Enhanced technical knowledge of PySpark through collaboration and mentorship from the tech lead

DevOps Consultant

Deloitte Australia
09.2017 - 01.2019
  • Developed Python-based ETL automation processes for an insurance company, optimising data ingestion and transformation workflows
  • Delivered automation of a Privileged Access Management (PAM) product for a large bank, streamlining workflows and reducing manual processes
  • Executed large-scale load testing and integrated automated security controls into DevSecOps pipelines to enhance compliance and operational efficiency

Education

Postgraduate Certificate in Management -

Macquarie Graduate School of Management
Sydney
11.2014

Postgraduate Diploma in Software Engineering -

Catholic University of Brasilia
Brasilia
11.2003

Bachelor of Science -

Federal University of Goiás

Skills

  • Databricks
  • Spark
  • Python
  • SQL
  • Azure
  • AWS
  • dbt
  • Snowflake
  • Team Leadership
  • Data Architecture
  • Data Modelling
  • Data Governance
  • Stakeholder Engagement
  • Cost Optimisation
  • Terraform
  • Git
  • Agile

Certification

  • AWS Certified Solutions Architect – Professional, 09/01/20
  • AWS Certified DevOps Engineer – Professional, 03/01/20
  • AWS Certified Solutions Architect – Associate, 09/01/20
  • AWS Certified SysOps Administrator – Associate, 03/01/20
  • AWS Certified Developer – Associate, 09/01/20
