12+ years of overall experience both leading teams and contributing individually across Big Data, Data Warehousing, Data Science, IoT, Cloud, PlatformOps, DevOps, Cloud Security, API, and Microservices technologies.
2+ years as a Senior Manager, managing teams of 50+.
10+ years of proficiency in Big Data technologies such as Cloudera and IBM BigInsights, processing terabytes of data at scale.
10+ years of experience in Data Warehousing (ETL/BI) and related technologies, working end-to-end from making sense of raw data from heterogeneous sources through building visualizations.
7+ years of experience working in Cloud, PlatformOps, DevOps, Automation, API, and Microservices technologies.
5+ years of experience designing end-to-end solution architecture across domains including Insurance, Banking, and Telecom.
6 months of experience working on Data Science projects.
5+ years of managing product delivery schedules, specification reviews, planning, budgeting, cost management, and resource allocation.
Expertise in AWS, platform architecture, Cloud Security, and automated CI/CD pipelines.
Successfully set up DaaS, IaaS, PaaS, and other cloud services by designing cost-effective solutions.
Expertise in Configuration Management tools like Puppet and Chef.
Experienced in ETL tools such as Ab Initio, Informatica, and DataStage, as well as visualization tools such as Tableau, MicroStrategy, and Cognos.
Recruited, mentored, and appraised team members; directly managed 30+ associates.
Effectively managed onsite/offshore delivery models across both US-India and Australia-India engagements.
Designed training and development programs to raise the quality of personnel.
Responsible for the design, scheduling, and support of all third-party product integrations.
Led the effort to develop company-wide best practices, standardized specification guidelines, and inter-departmental communications.
Strong analytical and problem-solving skills; able to work independently with little or no supervision or as a member of a team.
Reported directly to Cognizant's APAC VP on bench resource allocation, new client wins, and presentation of proposals and POCs.
Good written and verbal communication skills, strong organizational skills, and a hard-working team player, well practiced in fielding calls and answering business team queries.
Overview
17 years of professional experience
1 Certification
Work History
Senior Manager - IoT and Digital Engineering
Telstra Corporation Limited
Melbourne, VIC, AUS
07.2018 - Current
Technologist in the IoT (Internet of Things), HPSE (High Performance Software Engineering), M2M (Machine to Machine), and Networks Digital Engineering group at Telstra Corporation Limited.
Currently working as a PlatformOps/DevOps Architect and Lead Data Engineer, building highly available, cost-effective, resilient, and decoupled architecture for the most strategic project in the IoT/M2M space.
Designed and architected data ingestion and augmentation from new and legacy data sources, including streaming data, and wrote PySpark for massively parallel processing of terabytes of data using AWS Glue jobs.
Built a DevOps ecosystem with a fully automated CI/CD pipeline on an AWS microservices platform using Kubernetes and AWS-native components across Dev, UAT, and Prod tenancies.
Designed and architected ingestion of streaming, machine-generated IoT data from Kafka using the ETL tool Node-RED, landing the data in Redis ElastiCache.
Set up various AWS IaaS components such as WAF, bastion hosts, EC2, ALB, NLB, ASG, Security Groups, Route Tables, and IGW.
Expertise in setting up DaaS in AWS, including Aurora PostgreSQL, DynamoDB, MongoDB, Glue jobs, and Redis.
Expertise in setting up other AWS services, including API Gateway, Cognito, SNS, SSM, SageMaker, CodePipeline, CodeCommit, CodeBuild, ECS, and ECR.
Provide updates to executives and stakeholders in the governance meeting at the end of every sprint on the project roadmap and feature prioritization for this strategic, high-visibility in-house product development.
Perform a hands-on developer role as well, especially in AWS and data lake build-out, to stay current with the technologies.
Built both on-premise and cloud CI/CD pipelines for seamless, zero-downtime deployments by adopting a blue-green approach.
Successfully implemented code security scanning and Docker image security scanning, leveraging tools such as SourceClear, Coverity, Hawkeye, Clair, and Aquasec.
Collaborated with third-party vendors such as Solace (queuing service) and with other cloud solution providers such as Azure and Nokia Cloud to identify the most cost-effective solution across CSPs.
Worked with the Cloud Security team at Telstra to build secure services based on industry standards.
Worked actively with the penetration testing team semi-annually to check for vulnerabilities in the production front end, APIs, and back-end systems.
Worked with the Product Owner to prioritize features based on customer requirements.
Worked efficiently in an Agile environment as a Feature Lead and Scrum Master, conducting sprint planning, stand-ups, and sprint retrospectives, and defining epics/stories with the Product Owner.
Manager Big Data/DevOps Consultant
Geico Inc
Chevy Chase, Maryland, USA
02.2015 - 07.2018
Worked as a Lead Engineer/Manager in both the Big Data and PlatformOps spaces.
Collaborated with the Systems Engineering team and successfully migrated the on-premise data center to the cloud to build a data lake.
Managed a team of 20+ across onshore (USA) and offshore (India) locations and efficiently delivered sprints in both the PlatformOps and Big Data spaces.
Worked closely with executives and stakeholders to share updates with each release.
Automated continuous integration and deployments using Jenkins.
Installed, configured, and managed monitoring tools such as Splunk for resource, network, and log trace monitoring.
Designed the architectural flow for migrating 30+ years' worth of data from the legacy data warehouse to the data lake, to be consumed by data scientists for building models and performing dead-data analysis.
Worked closely with data scientists, efficiently slicing and dicing the data to their business needs for very low-latency retrieval.
Successfully implemented configuration management automation across a 200-node cluster running different Linux distributions to perform version upgrades and apply patches.
Set up the Chef server, workstation, and clients, and wrote scripts to deploy applications.
Managed cookbooks in Chef and implemented environments, roles, and templates for better environment management.
Created and implemented Chef cookbooks for deployments, using Chef recipes to define each deployment.
Used Jenkins to automate the build process and deploy the application to the NGINX server.
Implemented a microservices architecture by deploying Docker containers and sharing the API endpoints with developers.
Worked with external vendors such as Credit Suisse and Equifax for credit check related data.
Worked with Big Data solution vendors such as Cloudera and IBM (BigInsights) to understand the features offered and select the ideal Big Data solution for the given use case.
Drove the POC to select the organization-level ETL tool for processing massive amounts of data, shortlisting Talend for Big Data after analyzing several other tools across various attributes.
Working knowledge of visualization tools such as Tableau, MicroStrategy, and Cognos for building interactive reports and dashboards.
Implemented Agile in the team to deliver a Potentially Shippable Product (PSP) at the end of each sprint.
Automated the deployment process by building a CI/CD pipeline for each layer of the stack.
Recruited developers and leads both onshore and offshore.
Enabled the team to attend external technical training across multiple technologies to encourage effective cross-pollination of skills.
Lead Big Data/Senior DevOps Engineer
Cognizant Technology Solutions
Richmond, Virginia, USA
04.2012 - 02.2015
Developed and implemented automated Linux infrastructure using Puppet as the configuration management tool.
Configured Jenkins CI/CD to run builds in all non-production and production environments.
Wrote cookbooks to install the JDK and WebLogic; managed roles, environments, data bags, cookbooks, and recipes in Chef.
Configured Docker containers to build decoupled, microservices-based services.
Wrote shell scripts to automate routine tasks.
Coordinated with and assisted developers in establishing and applying appropriate branching, labeling, and naming conventions in Git source control.
Handled JIRA tickets for SCM support activities.
Worked as a 24/7 support engineer, performing hotfixes and debugging production issues.
Created user-level access to the relevant GitHub project directories for code changes.
Performed all necessary day-to-day Git support for different projects.
Supported the smooth release of emergency and expedited releases by obtaining director-level approval and coordinating with different teams.
Gathered all stakeholder approvals and necessary sign-offs while acting as release manager for two development teams.
Monitored multiple Hadoop cluster environments using Control-M; monitored workload, job performance, and capacity planning using Cloudera Manager.
Exported the analyzed data to relational databases using Sqoop for visualization and report generation in Business Intelligence tools.
ETL Lead
Tata Consultancy Services
Chennai, India
03.2011 - 04.2012
Worked as an Ab Initio ETL Lead, managing a team of 6.
Supported requirements gathering and understanding of the business functionality captured in the TPR (Technical Project Request) document.
Developed Ab Initio graphs according to the business requirements.
Analyzed source system file layouts and wrote DML to extract data from various sources such as flat files, tables, mainframe copybooks, and responder layouts.
Analyzed data transformation logic, mapping implementations, and data loading into the target database through Ab Initio graphs.
Developed UNIX shell scripts for process automation.
Fixed unit and functional test case/data defects.
Analyzed the existing application and identified improvements.
Provided 24/7 support for the applications deployed in production.
ETL Developer
Hewlett Packard Global India Pvt Ltd
Chennai, India
01.2009 - 03.2011
Worked as an Ab Initio developer loading data into the data warehouse.
Developed graphs according to the TPR document.
Analyzed source system file layouts and wrote DML to extract data from tables.
Analyzed data transformation logic, mapping implementations, and data loading into the target database through Ab Initio graphs.
Developed UNIX shell scripts to automate processes.
Fixed unit and functional test case/data defects.
Analyzed the existing application and identified improvements.
Prepared result analysis graphs to validate business test case results.
Performance-tested all graphs against large volumes of data.
Education
Bachelor of Engineering - Electronics and Communication Engineering