Summary
Overview
Work History
Education
Skills
Certification
Work Availability
Quote
Timeline
Paresh Bapna

Melbourne, Victoria

Summary

Strategic TOGAF-certified Solution Architect with 18+ years of superior industry performance, eager to apply expertise in process evaluation and optimization across Big Data and cloud solutions. Meticulous systems engineer with extensive experience in web/e-commerce and product development, and most recently in enterprise transitions from OLTP to OLAP. Hands-on experience in data design, data modelling, and PySpark coding/low-level design. The last 7+ years have focused on Azure/AWS cloud, Big Data, and DevOps using Java and PySpark; the previous 10 years on application and new product development using data structures, Java, SQL, and Oracle middleware technologies.

Overview

18 years of professional experience
1 certification

Work History

Resident Solution Architect

Databricks
Melbourne, Victoria
03.2020 - Current
  • Clients: Multiple
  • Product: Databricks
  • Key roles and responsibilities:
  • As an RSA at Databricks, responsible for performing various roles for customers using Databricks as a product and AWS/Azure/GCP as cloud platforms
  • Conducted clients' enterprise architecture reviews, provided insights into correct architectural decisions, and helped customers define baseline vs. target architecture states
  • As a designer, provided low-level implementation details for the suggested target architecture and proposed implementation plans
  • Provided network, performance, scalability, and infrastructure roadmaps
  • Raised key future blockers to various personas, including but not limited to enterprise architects, delivery managers, lead architects, key sponsor leadership teams, and department heads across various functions
  • The current role also involves hands-on implementation of IaC (preferably Terraform) and varied Databricks features, including complete ETL pipelines using the Bronze/Silver/Gold architecture
  • Suggested and implemented various tech stacks for big data pipelines, including data design and data analysis to gain meaningful insights into the data for its correct implementation
  • Followed by CI/CD integration frameworks and an NFR matrix covering performance enhancement and key security aspects, including platform, data, network, cloud, identity, and data-transition security
  • Handling multiple customers and stakeholders at a time has yielded profound insight into various customers' architectural, design, and implementation practices, their limitations, and their strengths.
  • Provided support for database upgrades and software deployments.
  • Evaluated operational processes and recommended cost-effective solutions.
  • Tailored and presented IT solutions to principals, CXOs, and managers across various regions, resulting in 15% growth in new business.
  • Defined technical integration strategy and developed integration plans.
  • Built and oversaw network infrastructure comprising various virtual products.
  • Coordinated and enhanced existing databases and established new databases as part of the initiative.
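The Bronze/Silver/Gold (medallion) ETL flow mentioned above can be sketched as follows. This is an illustrative outline only, using plain Python dictionaries in place of PySpark DataFrames; the field names (`id`, `amount`) are hypothetical.

```python
# Medallion (Bronze/Silver/Gold) pipeline sketch: raw ingest -> cleansed -> aggregated.
# Plain Python stands in for PySpark DataFrames; field names are illustrative.
from collections import defaultdict


def to_bronze(raw_records):
    """Bronze: land raw records as-is, tagging ingestion metadata."""
    return [{**r, "_ingested": True} for r in raw_records]


def to_silver(bronze):
    """Silver: cleanse - drop rows missing a key, normalise types."""
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in bronze
        if r.get("id") is not None and r.get("amount") is not None
    ]


def to_gold(silver):
    """Gold: business-level aggregate - total amount per id."""
    totals = defaultdict(float)
    for r in silver:
        totals[r["id"]] += r["amount"]
    return dict(totals)


raw = [{"id": 1, "amount": "10.5"}, {"id": None, "amount": "3"}, {"id": 1, "amount": "2.5"}]
gold = to_gold(to_silver(to_bronze(raw)))  # row with missing id is dropped in Silver
```

The point of the layering is that each stage is independently re-runnable: Bronze preserves the raw input, so Silver and Gold can be rebuilt when cleansing or aggregation rules change.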

Sr. Big Data Architect

Cognizant Australia
06.2021 - 03.2022
  • Orchestration Tools: ADF, Airflow, DB-Workflows
  • Big Data Modelling/Review Tools: Sparx EA, Erwin
  • TOGAF-certified solution engineer, providing banking solutions, consulting, and governance lifecycle support for Westpac Bank
  • Provided expert advice and support
  • Responsible for understanding client requirements and using architecture analysis methods to identify, evaluate, and recommend appropriate solutions
  • Undertook impact analysis on major design options and assessed and managed associated risks, cost, and time to deliver
  • Reviewed other supporting project and non-project documents and provided support to Project Control Boards and the Architecture Advisory Board
  • Collaborated with and facilitated stakeholder groups as part of formal or informal consultancy agreements
  • Addressed client needs through collaborative solution development, ensuring that proposed architectures aligned with functional and non-functional requirements
  • Conducted analysis of business requirements and maintained the ongoing balance between functional, service quality, security, and systems management design requirements
  • Reviewed architecture and system designs to ensure selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
  • Collected, outlined and refined requirements, led design processes and oversaw project progress.
  • Wrote and coded logical and physical database descriptions, specifying identifiers of database to management systems.
  • Promoted high customer satisfaction by resolving problems with knowledgeable and friendly service.
  • Planned and completed group projects, working smoothly with others.

Big Data Architect

Wipro: Telstra (Client)
Melbourne, Victoria
02.2019 - 05.2021
  • Orchestration Tools: Airflow, Azure Data Factory, Databricks
  • Big Data Modelling/Review Tool: Sparx EA
  • Big Data Profiling Tool: Talend
  • Languages: Java, PySpark
  • Telstra Cloud Migration (Cloudera to Azure) (ongoing)
  • Key designer for 'Datacore', the cloud migration team
  • Designer for the cloud migration program, which integrated key upstream systems such as SIGMA (product catalog), Salesforce (CRM, order capture), Amdocs (fulfillment), Aria (billing), SNOW (assurance), and legacy systems (RAPTOR, MICA, ..) using Microsoft Azure Cloud
  • Responsible for designing and architecting key migration strategies, including security, network design, network security, IAM security, data security, and security monitoring
  • Worked with other architects and designers to finalise network-layer security for communication from in-house applications (VMs) to the cloud (Azure) using SAS tokens, Azure Firewall design (hub-spoke model), outbound and inbound traffic policies using NSGs and FQDNs, and implementation of application gateways
  • Worked as data modeler for the semantic-layer fact/dimension models, using design patterns such as snowflake, star, and galaxy schemas per use case and reporting requirements
  • Azure identity and access management: responsible for designing security for resource management using AAD authentication > conditional access policies > role-based access control (RBAC) > Azure policies (Azure governance); also responsible for designing resource providers
  • Azure governance: responsible for architecting Azure policies and deployment policies at management group, subscription, and resource levels
  • Enabled core sprint-based development spanning discovery, elaboration, evolution, and retrospectives.
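The fact/dimension modelling described above can be illustrated with a minimal star-schema join. This is a sketch only: the dimension tables, surrogate keys, and column names here are hypothetical, and plain Python dictionaries stand in for warehouse tables.

```python
# Minimal star schema: one fact table resolved against two dimension tables.
# Tables, keys, and columns are illustrative, not the project's actual model.
dim_product = {10: {"name": "SIM"}, 11: {"name": "Modem"}}
dim_date = {20240101: {"year": 2024}}

fact_orders = [
    {"product_key": 10, "date_key": 20240101, "qty": 2},
    {"product_key": 11, "date_key": 20240101, "qty": 1},
]


def denormalise(fact_rows):
    """Resolve surrogate keys against the dimensions to build a reporting view."""
    return [
        {
            "product": dim_product[f["product_key"]]["name"],
            "year": dim_date[f["date_key"]]["year"],
            "qty": f["qty"],
        }
        for f in fact_rows
    ]


report = denormalise(fact_orders)
```

In a star schema each dimension is denormalised into a single table; a snowflake schema would instead normalise `dim_product` further (e.g. splitting out a product-category table), trading storage for extra joins.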

Big Data Architect

Wipro: SouthEastWater (Client)
Melbourne, Victoria
02.2019 - 12.2019

Client: SouthEastWater Cloud Implementation & Migration

  • Created conceptual, logical and physical data models for use in different business areas.
  • Developed and managed enterprise-wide data analytics environments.
  • Identified, protected and leveraged existing data.
  • Worked as part of project teams to coordinate database development and determine project scopes and limitations.
  • Demonstrated leadership by making improvements to work processes and helping to train others.
  • Prioritized and organized tasks to efficiently accomplish service goals.
  • Designed the customer's IoT cloud migration for metered devices
  • The key use case involved integration with meters and IoT devices sending small but frequent chunks of time-series data
  • Designed jobs to interact with IoT telco meter data and ingested the data into real-time streaming systems
  • Implemented workflow management to invoke Event Hub, Function App, and Spark transformation functions to ingest data into a NoSQL DB
  • Implemented device interactions (LwM2M) for firmware updates, bootstraps, and periodic updates from field devices
  • Ingested time-series and non-time-series transaction data using API Management
  • Key point of contact for data architecture decisions: data, data attributes, data domains, data flow, data transformation, and data passing through varied integration layers
  • Designed and implemented an algorithm for the first-time use of business-rules, configuration-based development, enabling the business to achieve strategic project implementation
  • Responsible for generating mapping sheets, writing implementation algorithms for the engineering team, producing design documentation and communication, and thereby interfacing between the business, BAs, and engineering teams.
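The small-but-frequent time-series pattern above is typically handled by grouping readings into fixed windows before storage. The following is a sketch of a tumbling-window aggregator in plain Python; the device IDs, tuple layout, and 60-second window are illustrative assumptions, not the project's actual design.

```python
# Tumbling-window aggregation of frequent, small time-series meter readings.
# Readings arrive as (device_id, epoch_seconds, value); window size is illustrative.
from collections import defaultdict


def window_avg(readings, window_sec=60):
    """Group readings into fixed non-overlapping windows and average per device."""
    buckets = defaultdict(list)
    for device_id, ts, value in readings:
        # ts // window_sec maps every timestamp in a window to the same bucket key
        buckets[(device_id, ts // window_sec)].append(value)
    return {k: sum(v) / len(v) for k, v in buckets.items()}


readings = [("meter-1", 5, 10.0), ("meter-1", 30, 20.0), ("meter-1", 65, 30.0)]
avgs = window_avg(readings)  # two windows: seconds 0-59 and 60-119
```

In the actual pipeline the same windowing would be done by the streaming engine (e.g. Spark Structured Streaming) rather than in application code, but the bucketing logic is the same idea.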

Development Lead

Reliance Jio
Mumbai, Maharashtra
02.2015 - 01.2019
  • Designed and contributed to a layered big data design following the Lambda architecture.
  • Responsible for delivering the end-to-end lifecycle of a big data platform using open-source technologies, including the Apache NoSQL DB HBase, Hive, Spark, and Jenkins.
  • Handled petabyte-scale inflow of network and CRM data, produced by Kafka producers and consumed into Elasticsearch, Hive, and HDFS.
  • Developed solutions for live streaming via Kafka queues, using multithreading and blocking queues at the consumer end.
  • Led a big data project developed in Spark, with feeds coming across various telecom channels, later migrated to Azure cloud.
  • Developed comparative POCs on HBase vs. Hive, MPP vs. data warehouse, and various data ingestion options, each corroborated by its advantages and disadvantages, as well as stored procedures vs. entity frameworks.
  • Responsible for a scalable and resilient design of HBase data ingestion and integration with Azure IoT/Event Hub. NoSQL DB design: tall-narrow vs. wide layouts on a per-need basis.
  • Developed a generic Spark framework for handling various input formats such as Avro, bytes, Delta, and delimited files.
  • Responsible for big data performance-tuning practices using Spark bucketing, pruning, partitioning, and joining strategies, including but not limited to broadcast joins, shuffle-hash joins, and sort-merge joins.
  • Generated detailed documents for the logical and physical data models, including design and metadata information.
  • Suggested and offered specific training programs to help workers maintain or improve job skills.
  • Negotiated contracts with clients for desired training outcomes, fees, or expenses.
  • Ensured that designs reflected data integrity and business rules and facilitated data integration.
  • Created mappings for the data mappers.
  • Provided guidance and support to the data mappers, technical team, and testing team.
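The tall-narrow vs. wide trade-off mentioned above can be illustrated with two row layouts for the same data. This is a sketch with Python dictionaries standing in for HBase rows; the entity IDs, `#` key separator, and column names are hypothetical.

```python
# Tall-narrow vs. wide NoSQL design: same readings, two row layouts.
# Tall-narrow: one row per (entity, date) -> many small rows; time-range scans
# become cheap prefix scans over the row key.
tall = {
    "cust1#20240101": {"m": 5},
    "cust1#20240102": {"m": 7},
}

# Wide: one row per entity, one column per date -> fewer rows, but each row
# grows over time and must be read as a whole.
wide = {
    "cust1": {"20240101": 5, "20240102": 7},
}


def total_tall(rows, entity):
    """Prefix scan over row keys, as an HBase scan with a start/stop key would do."""
    return sum(cols["m"] for key, cols in rows.items() if key.startswith(entity + "#"))


def total_wide(rows, entity):
    """Single row fetch; all columns for the entity come back at once."""
    return sum(rows[entity].values())
```

Both layouts answer the same query; the choice is driven by access patterns, since tall-narrow favours time-bounded scans while wide favours whole-entity reads.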

Development Manager

Oracle Financials
Pune, Maharashtra
06.2012 - 02.2015
  • Responsible for new banking product development, including analytical reports to help the business make smart decisions on different issues.
  • Developed the delinquency module of the banking product 'Oracle Banking Platform', implemented in the ADF and SOA frameworks.
  • Responsible for implementing key Java design patterns, involving creational and behavioural patterns per business use case.
  • Developed a Java REST API framework using Spring Boot.
  • Performed requirement gathering and data modelling and drove end-to-end solutions for financial feeds.
  • Responsible for processing huge volumes of banking transactional data using batch processing and XML (pub-sub) event-processing architecture.
  • The product is used by NAB Australia as its banking frontend, with an Oracle ADF interface.
  • Received the best-employee award in consecutive years for quick turnarounds, end-to-end implementation, and driving projects to success.
  • Designed reusable user interface components leveraged across screens, such as menus, task flows, UI templates, logging frameworks, and UI side panels.
  • This also involved development of reusable APIs for middleware and backend services, such as the WCF Task Service.
  • Developed common components for the user interface and API.
  • SOA/BPM implementation with asynchronous communication: designed code that fetches data from SOA services and saves data via SOA calls; implemented BPM process design workflows.
  • Coded various modules and developed business logic. Created JSF task flows, a security framework, and the JSF lifecycle design: JSF bindings (page def) - managed/backing bean - AM - VO - EO - SOA - middleware - DB APIs.
  • Coordinated with project leaders to resolve product issues.
  • Directed development team accountable for decision support and business intelligence (BI) for network engineers and client's business users.
  • Led 10 developers through all stages of project lifecycle in design and construction of financial data stores and business intelligence tools.
  • Collaborated with business and technical groups to align data warehousing project with business strategy, prioritize projects and launch data warehouses and data marts.
  • Participated in drafting and negotiating development contracts, providing input on factors such as turnaround time and potential ROI (Return on Investment).
  • Mentored team members to succeed and advance within department and company.
  • Planned and led training programs on staff development to enhance employee knowledge, engagement, satisfaction and performance.
  • Maintained corporate responsibility by staying up-to-date with laws affecting human resource training programs.

Project Lead

Emerson
Pune, Maharashtra
04.2011 - 05.2012

Trellis was developed in 2012, when device-to-software UI interaction designs were still young and innovative. The product was built to manage enormous data centers owned by Emerson, helping to manage, control, and instruct data center equipment such as racks, servers, air conditioners, and cooling systems.

  • Responsible for applying many new and radical design patterns and architectures using SOA, event bus, dependency injection, entity frameworks, XML parsers, custom XML-to-object conversions using XSLT and JAXB XML parsers, and ADF architecture for the UI.
  • Worked extensively as a team lead building a product that interacts with data center devices using MATLAB and converts the data into events.
  • Implemented a framework to read data from event queues and generate EDN pub-sub models.
  • Implemented asynchronous SOA patterns to consume events and convert them into the expected XML.
  • Developed Java XML parsers to convert XML into objects, later transforming these objects into entity-framework-relevant objects.
  • Designed semantic data models for analytical reports to support smart decision-making on different issues.
  • Worked closely with architecture, BA, business, and vendor teams to ensure the design followed enterprise-level standards.
  • Handled performance testing, load testing, system breakpoint testing, and tuning of various enterprise applications. Used testing tools such as load generators and JMeter to simulate loads. Analysed the load output reports and converted them into business-understandable metrics reported to senior management.
  • Outlined work plans, assessed resources and determined timelines for projects.
  • Maintained close connection with project personnel to quickly identify and resolve problems.
  • Forecasted, scheduled and monitored project timelines, personnel performance and cost efficiency.
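The XML-to-object conversion described above was done in Java with JAXB-style parsers; the same idea can be sketched in Python with the standard library. The element and attribute names (`event`, `id`, `temperature`) are hypothetical stand-ins for the device-event payloads.

```python
# Parse a device-event XML payload into plain objects (stdlib ElementTree
# standing in for the JAXB-style parsers described above).
import xml.etree.ElementTree as ET
from dataclasses import dataclass


@dataclass
class DeviceEvent:
    device_id: str
    temperature: float


def parse_events(payload: str):
    """Convert an <events> XML document into a list of DeviceEvent objects."""
    root = ET.fromstring(payload)
    return [
        DeviceEvent(e.get("id"), float(e.findtext("temperature")))
        for e in root.findall("event")
    ]


xml = "<events><event id='rack-1'><temperature>21.5</temperature></event></events>"
events = parse_events(xml)
```

As with JAXB, the benefit is that downstream code works with typed objects rather than raw XML nodes, so schema changes are contained in the parsing layer.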

Software Developer

Oracle India
Hyderabad, Andhra Pradesh
05.2008 - 04.2011
  • Responsible for Oracle Fusion AR Financials development. Oracle Apps Accounts Receivable (AR) is a financial ERP module used for handling the transactional receivables lifecycle and is one of the key financial modules of the Oracle ERP system. The role involved building key sub-components of AR R12, such as late charges, iReceivables, and invoices.
  • Worked extensively on Oracle Forms and Oracle custom .pld and .pls files.
  • Designed and developed complex algorithms for late-charges calculation, a key module of the Oracle Financials ERP AR product.
  • Migrated various Oracle Apps modules from Forms and Reports to Java and the ADF framework.
  • Was part of core financial module development of the Oracle Apps AR (Accounts Receivable) module.
  • Developed other late-charge modules of Oracle Financials, i.e., late-payment calculations using late-payment buckets and tiers.
  • Delivered code to meet functional or technical specifications.
  • Modified existing software systems to enhance performance and add new features.
  • Developed CRM Common components like Task.
  • Designed Code which fetches data from SOA services and saves data via calling SOA WCF Task Service.
  • Used advanced Java concepts and design patterns, including the reflection API, structural/creational/behavioural design patterns, JUnit, remote debugging, and in-depth use of Collections.

  • Met with stakeholders, product teams and customers throughout system development lifecycle.
  • Informed project manager of milestone updates and provided detailed project reports.
  • Contributed actionable suggestions for project development during departmental meetings.
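The bucket-and-tier late-payment calculation mentioned above can be sketched as a simple tier lookup. The day ranges and rates below are purely illustrative assumptions, not Oracle Financials' actual configuration or logic.

```python
# Tiered late-charge sketch: the charge rate depends on how many days overdue
# the balance is. Buckets and rates are illustrative only.
TIERS = [
    (30, 0.00),  # up to 30 days overdue: no charge
    (60, 0.02),  # 31-60 days: 2%
    (90, 0.05),  # 61-90 days: 5%
]
DEFAULT_RATE = 0.10  # beyond the last bucket: 10%


def late_charge(balance: float, days_overdue: int) -> float:
    """Return the late charge for a balance, using the first matching tier."""
    for limit, rate in TIERS:
        if days_overdue <= limit:
            return round(balance * rate, 2)
    return round(balance * DEFAULT_RATE, 2)
```

In a product, the tier table would live in configuration per customer or ledger rather than in code, which is what makes the bucket approach flexible.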

Software Engineer

Satyam (Now Tech Mahindra)
Hyderabad, Andhra Pradesh
07.2005 - 04.2008
  • Analyzed solutions and coded fixes for software problems.
  • Developed web applications using variety of engineering languages.
  • Coordinated with project managers to meet development timelines and plan testing.
  • Conducted full lifecycle software development from planning to deployment and maintenance.
  • Reviewed and modified unit and integration tests to improve software quality and reliability.
  • Responsible for development of a web-based application back in 2005: the LPG Customer Self Service (CSS) application, a web-based order-taking portal with Europe-wide coverage. CSS was a commercially successful, business-critical application: 100% of all UK customer orders were placed through it in 2005-06, and it was available to approximately 19,000 customers in the UK and Portugal.
  • The CSS application is based on the J2EE framework with an n-tier architecture. J2EE compliance makes it easy to build and deploy scalable, distributed applications: WebLogic Server and J2EE handle transaction services, security realms, guaranteed messaging, naming and directory services, database access and connection pooling, thread pooling, load balancing, and fault tolerance.
  • Used various design patterns, including JSP -> Struts -> business delegate, with the business logic layer in EJB and the data access layer in DAOs.
  • Performed regression and performance tests for updated systems.
  • Liaised with QA testers to perform testing meeting various parameters.
  • Engineered cross-platform software and exported system performance data.
  • Met with stakeholders, product teams and customers throughout system development lifecycle.

Education

MBA - Finance

IIM K
Kozhikode
03.2014

Bachelor of Engineering - Information Technology

Oriental Engineering, RGPV
Bhopal, Madhya Pradesh
07.2005

Skills

  • Databricks Architect and expert
  • Computer Architecture
  • Systems Design Analysis
  • Software Deployment Support
  • Systems Design
  • Infrastructure Development
  • Code Development
  • Technical Analysis
  • Critical Thinking
  • Project Costing
  • Design Optimization
  • ETL – Design Pattern, Architecture
  • Cloud Frameworks/Migration Strategies – Azure, AWS
  • Languages - Python, Py-Spark, Java, Spring Boot
  • Databases - Oracle, SQL Server, Azure Synapse, HBase, Hive, Cosmos DB, AWS DynamoDB
  • Dimensional Modelling, Data Warehousing Design
  • Scripting Technology – Shell scripting
  • Analytics - Power-BI, Tableau
  • Machine learning – Regression, clustering
  • Big Data - Hadoop, HDFS, Hive, Pig, Sqoop, Spark, Kafka, cloud migration
  • Process Transition / Agile/ CI-CD/DevOps
  • Domains – Banking, Telecom, Retail
  • Azure Tech Stack
  • Testing: Unit, Integration, Performance, Blackbox, Automation
  • Solutions deployment
  • Technology solution design
  • Network solutions

Certification

  • TOGAF Certified Architect
  • Azure/Databricks Certified Architect

Work Availability

Monday, Tuesday, Wednesday, Thursday, Friday, Saturday, Sunday
Morning, Afternoon, Evening

Quote

Judge a man by his questions rather than his answers.
Voltaire

Timeline

Sr. Big Data Architect

Cognizant Australia
06.2021 - 03.2022

Resident Solution Architect

Databricks
03.2020 - Current

Big Data Architect

Wipro: Telstra (Client)
02.2019 - 05.2021

Big Data Architect

Wipro: SouthEastWater (Client)
02.2019 - 12.2019

Development Lead

Reliance Jio
02.2015 - 01.2019

Development Manager

Oracle Financials
06.2012 - 02.2015

Project Lead

Emerson
04.2011 - 05.2012

Software Developer

Oracle India
05.2008 - 04.2011

Software Engineer

Satyam (Now Tech Mahindra)
07.2005 - 04.2008

MBA - Finance

IIM K

Bachelor of Engineering - Information Technology

Oriental Engineering, RGPV