Dijin Thomas

Data Scientist
Chatswood, NSW

Summary

Data Scientist familiar with gathering, cleaning and organizing data for use by technical and non-technical personnel. Advanced understanding of statistical, algebraic and other analytical techniques. Highly organized, motivated and diligent with significant background in Big Data Technologies.

Overview

4 years of post-secondary education
11 years of professional experience

Work History

Data Scientist

Westpac (Advanced Analytics - Group Strategy)
Sydney
07.2019 - Current
  • Provided comprehensive analysis and recommended solutions to complex business problems using data from internal and external sources, applying advanced analytical methods to assess factors impacting growth.
  • Scaled analytical capabilities across all business areas, evolving analytics to influence the bank's strategic planning and executives' decision-making.
  • Evaluated Auto Income Verification for mortgage applications to reduce TTR for Westpac MFI customers through CSH; identified low-risk mortgage applications to be prioritized to reduce TTR; identified options to increase straight-through processing of unsecured credit card applications for existing customers.
  • Created a data product covering all customer interactions across all channels.
  • Worked on multi-brand and multi-branch analysis for WBC and SGB.
  • Analyzed the performance of credit cards vs BNPL for existing customers and prototyped an approval process for a BNPL product for WBC.
  • Evaluated the benefits of fixed vs variable rates for secured lending.
  • Prototyped transaction categorization for transactions from OFI/Open Banking data.
  • Analyzed the performance of the third-party mortgage broker queue.
  • Analyzed different types of interactions and the impact of branch closures in rural, regional and urban areas; evaluated customer digital interaction behaviors pre- and post-closure of the home branch.
  • Analyzed the Covid-19 impact on affluent vs non-affluent customers, including the impact on customer savings.
  • Conducted Dataiku tool evaluations and internal training sessions with vendors.

Big Data Consultant

Westpac (Accenture)
Sydney
03.2018 - 07.2019

FSV - ICM & Lenders Assist

Income Confidence Measure (ICM) provides a confidence indicator for different customer income streams. ICM is calculated from statistics on the customer's income pattern, frequency, and regularity across all sources, including salary, superannuation, investments, and other income. The confidence measure is used to auto-approve PL/credit card applications for high-value, highly rated customers.
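
For illustration only, a minimal Pandas sketch of an income-confidence style score, assuming hypothetical column names (customer_id, source, date, amount) and a simple regularity-based formula rather than the actual ICM rules:

# Minimal sketch of an income-confidence style score in Pandas.
# The column names and the scoring formula are illustrative assumptions,
# not the actual ICM rules.
import pandas as pd

deposits = pd.DataFrame({
    "customer_id": [1, 1, 1, 1, 1, 1],
    "source": ["salary"] * 6,
    "date": pd.to_datetime(["2019-01-15", "2019-02-15", "2019-03-15",
                            "2019-04-15", "2019-05-15", "2019-06-14"]),
    "amount": [5000, 5000, 5100, 5000, 5050, 5000],
})

def income_confidence(group: pd.DataFrame) -> float:
    """Score in [0, 1]: higher when deposits are regular in timing and amount."""
    gaps = group["date"].sort_values().diff().dt.days.dropna()
    gap_regularity = 1 - (gaps.std() / gaps.mean()) if len(gaps) > 1 else 0.0
    amount_regularity = 1 - (group["amount"].std() / group["amount"].mean())
    return float(max(0.0, min(1.0, 0.5 * gap_regularity + 0.5 * amount_regularity)))

scores = (deposits.groupby(["customer_id", "source"])[["date", "amount"]]
          .apply(income_confidence)
          .rename("income_confidence")
          .reset_index())
print(scores)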

Transaction Categorization

Transaction categorization assigns a category to every transaction made by customers across the Westpac group. The categorized data is used by multiple use cases, including compliance, lending, and customer risk profiling.

  • Demonstrated consulting-based leadership experience on specialized projects and displayed overall knowledge of information systems.
  • Applied statistical and algebraic techniques to interpret key points from gathered data.
  • Collaborated with internal stakeholders, identifying and gathering analytical requirements for customer, product, and project needs.
  • Led projects and analyzed data to identify opportunities for improvement.
  • Worked closely with business executives and senior data scientists to define ICM rules.
  • Defined transaction categorization rules in collaboration with the business.
  • Developed and refined code to reach the required match-rate target.
  • Worked with relevant users on business requirements.
  • Coded in Python using Spark and Pandas on the big data platform.
  • Wrote SQL over big data using Spark SQL from Python (a brief illustrative sketch follows).
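
As a rough illustration of the rule-based approach, a minimal PySpark sketch that applies Spark SQL categorization rules to a toy set of transactions; the column names and rules here are assumptions, not the production categorization logic:

# Minimal sketch of rule-based transaction categorization with Spark SQL.
# The column names (txn_id, description, amount) and the rules themselves
# are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("txn_categorization_sketch").getOrCreate()

txns = spark.createDataFrame(
    [(1, "WOOLWORTHS 1234 SYDNEY", -54.20),
     (2, "SALARY ACME PTY LTD", 4200.00),
     (3, "NETFLIX.COM", -16.99)],
    ["txn_id", "description", "amount"],
)
txns.createOrReplaceTempView("transactions")

# Rules expressed in SQL; in practice these are defined with the business and
# refined until the required match rate is reached.
categorized = spark.sql("""
    SELECT txn_id,
           description,
           amount,
           CASE
             WHEN upper(description) LIKE '%WOOLWORTHS%' THEN 'Groceries'
             WHEN upper(description) LIKE '%SALARY%'     THEN 'Income'
             WHEN upper(description) LIKE '%NETFLIX%'    THEN 'Entertainment'
             ELSE 'Uncategorized'
           END AS category
    FROM transactions
""")
categorized.show(truncate=False)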

Big Data Specialist

HSBC (Infosys)
London
10.2015 - 03.2018

Customer Due Diligence and KYC:

CDD/KYC builds an aggregated transaction summary for each customer in order to re-rate the customer's risk. It helps to better monitor customer activities and identify anomalies.

Common Payments Model:

The Common Payments Model acts as a single golden data source for HSBC payments data drawn from multiple source systems. The model enabled business users to view and build reports over multiple payment source systems from multiple countries, giving a consolidated view of payments data.

  • Worked closely with business stakeholders, the chief data officer, and senior architects to maintain optimum levels of communication and complete projects effectively and efficiently.
  • Programmed in Python and Java to create customer summaries of monthly credit and debit transactions (a brief illustrative sketch follows).
  • Wrote queries on the Hadoop platform using Apache Hive, Apache Pig, shell scripting, and Python to set risk parameters and identify irregular transaction patterns.
  • Created visualizations using the Tableau and Platfora reporting tools.
  • Programmed in Python to aggregate data from multiple sources and transform it into the required format.
  • Led an offshore team of 5 members.
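
A minimal sketch of the kind of monthly credit/debit customer summary described above, run as Spark SQL over a Hive table from Python; the database, table, and column names are assumptions, not the actual schema:

# Minimal sketch of a monthly credit/debit summary per customer over a Hive
# table. The table name (payments.transactions) and columns (customer_id,
# txn_date, amount) are illustrative assumptions.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("cdd_monthly_summary_sketch")
         .enableHiveSupport()
         .getOrCreate())

monthly_summary = spark.sql("""
    SELECT customer_id,
           date_format(txn_date, 'yyyy-MM')                  AS txn_month,
           SUM(CASE WHEN amount > 0 THEN amount ELSE 0 END)  AS total_credit,
           SUM(CASE WHEN amount < 0 THEN -amount ELSE 0 END) AS total_debit,
           COUNT(*)                                          AS txn_count
    FROM payments.transactions
    GROUP BY customer_id, date_format(txn_date, 'yyyy-MM')
""")

# Downstream, risk parameters (e.g. unusually large month-on-month swings)
# would be applied to flag irregular transaction patterns for review.
monthly_summary.show()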

Big Data Hadoop Developer

Apple (Infosys)
Hyderabad, India
06.2014 - 10.2015

Apple Law Enforcement & Compliance:

The Apple Law Enforcement & Compliance team needs to generate batch and ad-hoc reports for requested DSID, IP, and machine addresses as and when required by the governing body and the compliance team. These reports enable Apple to comply with regulatory bodies.

Apple iTunes Ops Migration:

The Paper Tiger set of reports generates a monthly report of purchases made in Apple iTunes across different USD ranges. The reports were initially implemented in Teradata but were moved to Hadoop/Hive for better cost efficiency.

  • Met with key stakeholders to discuss and understand all major aspects of the project, including scope, required tasks, and deadlines.
  • Wrote SQL queries on the Hadoop platform using Apache Hive to generate compliance reports (a brief illustrative sketch follows).
  • Created visualizations using Tableau.
  • Worked collaboratively with the client/onsite team to deliver the project.
  • Developed highly maintainable Hadoop code and followed coding best practices.
  • Inspected and analyzed existing Hadoop environments for proposed product launches, producing cost/benefit analyses for the use of included legacy assets.
  • Supervised Hadoop projects and offered assistance and guidance to junior developers.
  • Performed data cleaning on unstructured information using various Hadoop tools.
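
For illustration, a minimal sketch of a Paper Tiger-style monthly report that buckets iTunes purchases into USD ranges, written as Hive-compatible SQL driven from Python; the table and column names are assumptions about the schema:

# Minimal sketch of a monthly purchases-by-USD-range report. The table name
# (itunes.purchases) and columns (purchase_date, amount_usd) are illustrative
# assumptions, not the actual tables.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("paper_tiger_report_sketch")
         .enableHiveSupport()
         .getOrCreate())

report = spark.sql("""
    SELECT purchase_month,
           usd_range,
           COUNT(*)        AS purchase_count,
           SUM(amount_usd) AS total_usd
    FROM (
        SELECT date_format(purchase_date, 'yyyy-MM') AS purchase_month,
               CASE
                 WHEN amount_usd < 1   THEN 'under $1'
                 WHEN amount_usd < 10  THEN '$1 - $9.99'
                 WHEN amount_usd < 100 THEN '$10 - $99.99'
                 ELSE '$100 and above'
               END AS usd_range,
               amount_usd
        FROM itunes.purchases
    ) bucketed
    GROUP BY purchase_month, usd_range
""")

report.show()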

Senior Developer

Horizon Blue Cross Blue Shield-USA (Infosys)
Hyderabad, India
07.2011 - 06.2014

Horizon Blue Cross Blue Shield of New Jersey

Horizon Blue Cross Blue Shield of New Jersey is a leading health insurance provider in New Jersey. Payment claims are generated on TSO mainframe screens. ClearQuest is used as the ticketing tool, and tickets are assigned to the team for claim mismatches and other customer-related issues.

  • Wrote Java code and unit test cases.
  • Wrote SQL queries and stored procedures in DB2.

Systems Engineer Trainee

Infosys Tech. Ltd
Mysore, India
02.2011 - 07.2011

Trainee Engineer

Cognizant Technologies
Chennai, India
11.2010 - 02.2011

Education

Bachelor of Science - Information Technology

SD Bansal College of Technology (RGTU Bhopal)
Indore, Madhya Pradesh, India
09.2006 - 06.2010

Skills

Data-Driven Strategy

Timeline

Data Scientist

Westpac (Advanced Analytics - Group Strategy)
07.2019 - Current

Big Data Consultant

Westpac (Accenture)
03.2018 - 07.2019

Big Data Specialist

HSBC (Infosys)
10.2015 - 03.2018

Big Data Hadoop Developer

Apple (Infosys)
06.2014 - 10.2015

Senior Developer

Horizon Blue Cross Blue Shield-USA (Infosys)
07.2011 - 06.2014

Systems Engineer Trainee

Infosys Tech. Ltd
02.2011 - 07.2011

Trainee Engineer

Cognizant Technologies
11.2010 - 02.2011

Bachelor of Science - Information Technology

SD Bansal College of Technology (RGTU Bhopal)
09.2006 - 06.2010