· Over 4 years of IT experience in the analysis, design, development, testing, and implementation of ETL and Business Intelligence solutions using data warehouse/data mart design, SQL Server, Azure SQL, and the MSBI stack including Power BI.
· Application experience across the software development life cycle (SDLC) and Agile, Scrum, UML, Waterfall, and project management methodologies.
· Experience designing, developing, and deploying Business Intelligence solutions using SSIS, SSRS, SSAS, and Power BI.
· Experience creating Power BI dashboards (Power View, Power Query, Power Pivot, and Power Map).
· Hands-on experience with MS SQL Server 2019/2016/2014/2012 and its Business Intelligence stack: SQL Server Integration Services, SQL Server Analysis Services, and SQL Server Reporting Services.
· Experience with ETL methods for data extraction, transformation, and loading in corporate-wide ETL solutions and data warehouse tools for reporting and data analysis.
· Good working experience designing STAR and SNOWFLAKE schemas, database modeling, and logical and physical database design using Erwin; knowledge of how the Kimball approach is implemented in the Microsoft BI tools.
· Experience importing, transforming, and analyzing data with Power BI Desktop: authoring reports, visualizing data, automating report refresh, and producing and sharing dashboards based on reports.
· Proficient in data analysis and visualization using tools such as Tableau, Power BI, and Excel, leveraging advanced features such as pivot tables, charts, and DAX queries to derive actionable insights.
· Experience in data extraction, data cleaning, statistical modeling, and data visualization with large sets of structured and unstructured data.
· Extensive experience with various data processing platforms and languages, including Oracle PL/SQL, SQL Server T-SQL, MySQL, and PostgreSQL.
· Implemented Row-Level Security (RLS) as part of security in Power BI.
· Experienced in working closely with stakeholders, gathering requirements, and translating business needs into technical solutions.
· Experienced with DAX (Data Analysis Expressions), including the ability to create intricate calculations and measures to satisfy analytical requirements.
· Skilled in NoSQL and Big Data technologies such as Hadoop, MongoDB, and Cassandra.
· Hands-on experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
· Proficient in version control using Git, enabling efficient collaboration and tracking of code changes throughout the project lifecycle.
· Knowledge of automating the build, test, and deployment processes for software projects requiring continuous integration and delivery (CI/CD) using Jenkins and Azure DevOps.
· Experience designing and implementing reports and visuals: drill-through pages, tree maps, funnel charts, line charts, KPI scorecards, slicers, buttons, bookmarks, dashboards, and custom visuals.
· Used Tableau Desktop to design and develop visual components providing process automation, workflow, and dashboard display frameworks.
· Extensive experience in text analytics, diverse statistical data mining, and data visualization, applying R, Python, and Tableau to numerous business problems.
· Competent in writing PySpark, SQL, and Python queries in Azure Databricks to perform transformations on the data.
· Familiar with Agile methodologies such as Scrum; experience using Jira for project management and task tracking.
· Experience coding Unix/Windows scripts for file transfers.
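The Row-Level Security (RLS) experience noted above can be sketched conceptually in plain Python. This is a hypothetical illustration of the idea behind Power BI RLS (each role maps to a filter predicate applied before data reaches the report), not actual Power BI configuration; the role names and rows are invented.

```python
# Hypothetical sketch of the row-level security (RLS) idea: each role maps
# to a filter predicate, and rows a user may not see are removed up front.
ROLE_FILTERS = {
    "northeast_sales": lambda row: row["region"] == "Northeast",
    "admin": lambda row: True,  # admins see every row
}

def apply_rls(rows, role):
    """Return only the rows the given role is allowed to see."""
    predicate = ROLE_FILTERS[role]
    return [row for row in rows if predicate(row)]

sales = [
    {"region": "Northeast", "amount": 120},
    {"region": "West", "amount": 340},
]

print(apply_rls(sales, "northeast_sales"))  # only the Northeast row survives
```

In Power BI itself, the equivalent predicate would be defined as a DAX table filter on a role and assigned to users in the Power BI service.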
Company: S&P
Client: KPMG
Role: Power BI Developer Duration: Nov 2021 to Present
Description: As part of the data innovation team, focused on developing analytics-based products and solutions for the client service industry. My responsibility is to build scalable, interactive Business Intelligence solutions that serve the client in driving business insights. The role involves working independently in an Agile approach while collaborating with counterparts across workstreams to develop automated analytics business solutions.
Responsibilities:
· Designed and developed Power BI graphical and visualization solutions with business requirement documents and plans for creating interactive dashboards.
· Handled security requirements when creating reports, charts, and dashboards using Power BI Desktop.
· Created Power BI dashboards (Power View, Power Query, Power Pivot, Power Map).
· Used Power Query to transform, define, and browse data from the data source and to execute merge and append queries.
· Used T-SQL constructs for data querying, such as joins, subqueries, derived tables, and views.
· Implemented many DAX functions for various fact calculations to facilitate effective data display in Power BI.
· Built detail-level and summary report sets using groups, hierarchies, background images, maps, trend lines, calculations, advanced analytical features, and KPI dashboards.
· Responsible for optimizing long-running queries on SQL Server 2019/2016/2012.
· Experienced with a range of data processing languages and systems, including Oracle PL/SQL, SQL Server T-SQL, MySQL, and PostgreSQL.
· Extensive experience with Snowflake modeling, fact and dimension tables, physical and logical data modeling, star schema modeling, and data warehousing.
· Worked on data modeling and data mining to model the data as per business requirements.
· Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
· Developed a variety of joins (inner and outer) and complex SQL queries in Access to retrieve the required output for data analysis; imported data into Power BI from a variety of sources, including SQL Server, Excel, the cloud, and SQL Azure.
· Primarily involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.
· Implemented row-level security for the Tableau dashboards and used various advanced table calculations for the dashboards.
· Implemented version control using Git for ETL processes, data mappings, analytics scripts, dashboard development, and report templates, ensuring collaboration and change tracking.
· Utilized Git as the version control system for source code management, ensuring efficient collaboration and change management.
· Involved in migrating Tableau reports to Power BI with multiple data sources such as Salesforce and web portals; developed custom visuals using Python and R.
· Generated Databricks notebooks using SQL, Python, and PySpark, and automated notebook runs using tasks.
· Developed Spark applications using Scala and Spark SQL for testing and data processing.
· Track record of configuring Jira projects with a wide range of schemes, including complex workflows, permission schemes, and notification schemes.
· Actively applied Agile Methodology practices for development of the Microsoft reporting environment.
Environment: Power BI, Tableau, SQL, Azure Data Factory, RLS, Git, Python, PySpark, Scala, Jira, Excel, MySQL, PostgreSQL, KPI, Snowflake schema, Star schema, T-SQL, DAX, Power BI Desktop, Power View, Power Pivot, Power Query, Power Map, Agile, ETL.
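The T-SQL constructs cited in this role (joins, subqueries, derived tables, views) can be illustrated with a minimal, self-contained sketch. SQLite (via Python's built-in sqlite3 module) stands in here for SQL Server, and the fact/dimension table names are invented for the example:

```python
# Minimal stand-in for the T-SQL work above: a join, a view over a star
# schema, and a subquery, run on an in-memory SQLite database instead of
# SQL Server. Table names and data are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                             customer_id INTEGER, amount REAL);
    INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO fact_sales VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
    -- A view layering a join over the star schema, usable as a report source.
    CREATE VIEW v_sales_by_customer AS
        SELECT c.name, SUM(f.amount) AS total
        FROM fact_sales f
        JOIN dim_customer c ON c.customer_id = f.customer_id
        GROUP BY c.name;
""")

# Subquery: customers whose total sales exceed the average customer total.
rows = cur.execute("""
    SELECT name, total FROM v_sales_by_customer
    WHERE total > (SELECT AVG(total) FROM v_sales_by_customer)
""").fetchall()
print(rows)  # [('Acme', 350.0)]
```

In production, views like this sit between the warehouse tables and the Power BI dataset, so report-side queries stay simple while join logic lives in the database.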
Company: EPAM
Client: IQVIA
Role: BI/SQL Developer Duration: May 2019 to Nov 2021
Description: IQVIA Consumer Health Ecomm and its BIDS team (Business Intelligence and Data Science) continuously invest in ways to help life science companies find the next breakthrough in improving human health, building industry benchmarking to compare financial and operational performance and to target areas for improvement. Created a data warehouse and reports performing IQVIA data analytics, projection, imputation, and quality checks.
Responsibilities:
· Designed, developed, modified, and enhanced database structures and database objects; data warehouse and data mart designs efficiently support BI and end-user requirements.
· Used Power Query to implement ETL methods in Power BI Desktop, combining diverse data sources, including SQL Server, spreadsheets, and external data obtained through web services, into a single, centralized data warehouse.
· Designed Power BI data visualizations utilizing cross tabs, maps, scatter plots, and pie, bar, and density charts.
· Daily tasks included processing client-supplied external files and, on occasion, transferring data across servers using legacy DTS packages in a SQL Server 2000/2005/2008/2016 environment.
· Used advanced PL/SQL features, such as records, tables, object types, and dynamic SQL, extensively.
· Designed the structure and relationships between fact and dimension tables, developing the data model as a star schema so that end users can produce their own self-service BI reports.
· Designed the Extraction, Transformation, and Loading (ETL) process using SSIS to extract data from flat files, Excel files, and SQL Server databases.
· Created Power BI reports using DAX against a variety of data sources, including Microsoft SQL Server, MySQL, SharePoint, and Excel, and published the BI solutions to the Power BI service.
· Developed dashboards in the Tableau environment and published them to the server using Tableau Desktop.
· Created data visualizations using various visual objects (line chart, bar chart, tree map, single card, KPI, map, pie chart, custom visuals).
· Involved in creating the external and internal tables in Azure SQL Data warehouse and created stored procedures to move the data from external to internal tables.
· Developed automated reports and scheduled updates using Tableau's publishing and scheduling features, providing stakeholders with up-to-date insights.
· Created Python scripts to process big data sets and to extract data from various files.
· Employed Jira as a tracking tool for the sprints; participated in daily Scrum meetings, sprint planning, showcases, and retrospectives; and adhered to the Agile methodology.
· Developed various UNIX shell scripts for data scrubbing to load monthly data into the Redshift data warehouse.
Environment: Power BI, Power Query, ETL, SQL Server, PL/SQL, Star schema, SSIS, DAX, Git, Azure, Tableau, Python, Jira, Unix shell, Agile, Scrum, MySQL, Excel.
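The merge-and-append pattern used throughout the Power Query work above can be sketched in plain Python. This is a conceptual illustration only; the datasets, field names, and prices are invented for the example.

```python
# Illustrative sketch of Power Query's "append" (stack rows from two
# sources) and "merge" (join on a shared key) operations, in plain Python.
store_a = [{"sku": "X1", "units": 5}, {"sku": "X2", "units": 3}]
store_b = [{"sku": "X1", "units": 2}]
prices = {"X1": 10.0, "X2": 4.0}  # a lookup table, like a dimension

# Append: stack the two extracts into one combined table.
combined = store_a + store_b

# Merge: look up the price for each row and compute a derived column.
enriched = [dict(row, revenue=row["units"] * prices[row["sku"]])
            for row in combined]

total_revenue = sum(row["revenue"] for row in enriched)
print(total_revenue)  # 5*10.0 + 3*4.0 + 2*10.0 = 82.0
```

In Power Query the same steps would be "Append Queries", "Merge Queries", and an added custom column, with the result loaded into the data model for reporting.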