Zclus - Harish - Data Engineer
Professional Summary
4+ years of experience in Microsoft Azure cloud technologies.
Experience in migrating SQL databases to Azure Data Lake Store, Azure Data Lake
Analytics, and Azure SQL Database, controlling and granting database access, and
migrating on-premises databases to Azure Data Lake Store using Azure Data
Factory.
Created resource groups, storage accounts, pipelines, datasets, and linked services,
and invoked pipelines using PowerShell (an equivalent Python SDK call is sketched
after this summary).
Hands-on experience in Azure analytics services – Azure Data Lake Store (ADLS)
Gen1 and Gen2, Azure Data Factory (ADF), and Azure Databricks (basics).
Built data integration pipelines in ADF using activities such as Get Metadata,
Lookup, ForEach, Wait, Execute Pipeline, Set Variable, and Filter.
Implemented a dynamic, parameterized pipeline to extract multiple files into
multiple targets with a single pipeline.
Automated execution of ADF pipelines using triggers and runbooks.
Knowledge of basic ADF administration activities, such as granting access to ADLS
using a service principal, installing the integration runtime (IR), and creating
services like ADLS and Logic Apps.
Extensively worked with data source types – SQL Server, Oracle, flat files, JSON,
CSV, and Excel.
Designed an Azure Logic App to send pipeline success/failure alert emails,
file-unavailability notifications, etc.
Good documentation skills.
Familiar with pipeline execution methods (debug runs vs. triggers).
Monitored and managed Azure Data Factory.
Involved in production support activities.
Basic knowledge of Azure Databricks and notebook creation.
Created mount points to read data from ADLS Gen2 and ingest it into Azure SQL DB
using Azure Databricks (see the Databricks sketch after this summary).
Extensively used ETL methodology to support data extraction.
Knowledge of Power BI.
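The summary above mentions invoking ADF pipelines with PowerShell; as a rough
Python equivalent (Python is listed under Technical Skills), the sketch below
triggers and polls a pipeline run using the azure-mgmt-datafactory SDK. The
subscription, resource group, factory, and pipeline names are placeholders.

    # Hypothetical sketch: trigger an ADF pipeline run and check its status.
    # Requires: pip install azure-identity azure-mgmt-datafactory
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    subscription_id = "<subscription-id>"   # placeholder
    resource_group = "<resource-group>"     # placeholder
    factory_name = "<data-factory-name>"    # placeholder
    pipeline_name = "<pipeline-name>"       # placeholder

    adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

    # Kick off the pipeline (the parameters dict is optional).
    run = adf_client.pipelines.create_run(
        resource_group, factory_name, pipeline_name, parameters={}
    )

    # Check the run status (Queued / InProgress / Succeeded / Failed).
    status = adf_client.pipeline_runs.get(resource_group, factory_name, run.run_id)
    print(status.status)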
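Below is a minimal sketch of the mount-and-ingest flow described in the last
bullet, as it could look in an Azure Databricks notebook (where spark and dbutils
are provided by the runtime). The storage account, container, secret scope, server,
and table names are hypothetical placeholders.

    # Hypothetical Databricks notebook cells: mount ADLS Gen2 with a service
    # principal, read a file from the mount, and append it to Azure SQL DB via JDBC.
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": dbutils.secrets.get("kv-scope", "sp-app-id"),
        "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("kv-scope", "sp-secret"),
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    # Mount the container once; later reads go through the /mnt path.
    dbutils.fs.mount(
        source="abfss://raw@<storage-account>.dfs.core.windows.net/",
        mount_point="/mnt/raw",
        extra_configs=configs,
    )

    # Read a sample file from the mount.
    df = spark.read.option("header", "true").csv("/mnt/raw/input/sample.csv")

    # Ingest into Azure SQL DB with the built-in JDBC writer.
    (df.write.format("jdbc")
       .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
       .option("dbtable", "dbo.sample_table")
       .option("user", dbutils.secrets.get("kv-scope", "sql-user"))
       .option("password", dbutils.secrets.get("kv-scope", "sql-password"))
       .mode("append")
       .save())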
Education:
B.Tech in Electronics and Communication Engineering, Jawaharlal Nehru
Technological University.
Certifications:
Microsoft Certified: Azure Fundamentals
Credential ID: 992064599
Credential URL: https://www.credly.com/badges/bbc9c49a-6b86-4f90-9939-81e043a9f248
Professional Experience:
Currently working as a Software Engineer at Saxon.AI.
Technical Skills:
Cloud Platform: Microsoft Azure
Operating Systems: Windows, Linux
Languages: Python (basic), SQL
Databases: SQL Server, Oracle, Azure SQL Data Warehouse, Synapse
Tools: Azure Data Factory, Azure Data Lake Storage, Azure Databricks (basics)
Others: Jira, Confluence, Azure DevOps, Power BI, Tableau, SharePoint
Project 1:
Organization : Saxon.AI
Customer : ConcertGroup Inc
Period : November 2021 to February 2022
Description : ConcertGroup Inc is an insurance company. Concert Group was founded
in Chicago, Illinois by successful industry entrepreneurs
responding to significant fronting demand within their diverse client bases.
Developed a database solution for the insurance dataset with master and reference
schemas.
Implemented efficient ADF pipelines to import data from the local system to
Azure SQL.
Imported and transformed data using Power Query.
Created various Power BI insurance reports involving a variety of features such as
bar charts, line charts, filters, custom visuals, drill-downs, etc.
Built forecasts using parameters, trend lines, and reference lines, replacing
reports that had previously been prepared manually.
Configured scheduled automatic refresh in the Power BI service.
Added dynamic measures to automate reports based on clients’ requirements.
Implemented row-level security (RLS).
Applied pivot/unpivot transformations.
Assisted the customer with analysis of claims incurred vs. premium earned for a
year across their different lines of business (LOBs).
Worked closely with the client to deliver efficient, high-quality results.
Environment – Microsoft Azure, Azure Data Factory, Azure SQL DB, Databricks,
Power BI, Azure Analysis Services, Azure DevOps, Data Lake Storage, SSMS.
Project 2:
Organization : Saxon.AI
Customer : Insightbox (Internal product)
Period : February 2021 to June 2022
Description : Saxon is a data and analytics company with end-to-end data
engineering and analytics services leveraging next-gen technologies from
leading market vendors. Our core differentiators are robust frameworks, strong domain
expertise, DataOps, and accelerators that provide quick time to value and maximize ROI.
Project 3:
Organization : MOLIPS
Customer : ONE (Shipping Logistics)
Period : July 2018 to December 2020
Description: MOL (Mitsui O.S.K. Lines) was launched in 1964. MOL is a Japanese
transport company headquartered in Japan and one of the largest shipping companies
in the world. The MOL fleet includes dry cargo ships (bulk carriers), liquefied natural
gas carriers, Ro-Ro car carrier ships, oil tankers, container ships (among which MOL
Triumph is the 4th largest containership in the world), and container terminals. Focus
on container shipping has been reduced since April 2018.
Sent all export documents, such as the shipping bill, survey report, and other
documents, to the agents for stuffing and loading the cargo on the vessel.
Supervised cargo operations, updated principals about vessel performance, and
finalized disbursement accounts.
Effectively liaised with port, terminal, and statutory authorities for quick
delivery of bulk consignments with no claims for damage or other causes.
Monitored all pending clearances and delivery order collections; sent timely
reminders to customers to ensure that all sea bills, invoices, and documents were
prepared accurately.
Created pipelines to extract data from on-premises source systems to Azure
cloud Data Lake Storage; extensively worked on Copy activities and implemented
copy behaviors such as flatten hierarchy, preserve hierarchy, and merge
hierarchy; implemented error handling through the Copy activity.
Exposure to Azure Data Factory activities such as Lookup, Stored Procedure, If
Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait.
Configured Logic Apps to send email notifications to end users and key
stakeholders with the help of the Web activity; created a dynamic pipeline to
handle extraction from multiple sources to multiple targets; extensively used Azure
Key Vault to configure the connections in linked services.
Configured and implemented Azure Data Factory triggers and scheduled the
pipelines; monitored the scheduled Azure Data Factory pipelines and configured
alerts to receive notifications of failed pipelines.
Created Azure Stream Analytics jobs to replicate real-time data and load it into
Azure SQL Data Warehouse.
Implemented delta-logic extractions for various sources with the help of a control
table; implemented data frameworks to handle deadlocks, recovery, and logging of
pipeline data (a watermark-based extraction sketch follows this list).
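The delta-logic bullet above describes a watermark pattern driven by a control
table. Below is an illustrative PySpark sketch of that pattern under assumed names;
the control table dbo.etl_control, the source table dbo.orders, the modified_at
column, and the connection details are hypothetical placeholders, not the actual
project objects.

    # Hypothetical watermark-based (delta) extraction using a control table.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>"
    props = {"user": "<user>", "password": "<password>"}

    # 1. Read the last successfully loaded timestamp for this source.
    control = spark.read.jdbc(jdbc_url, "dbo.etl_control", properties=props)
    last_watermark = (control.filter(control.source_table == "dbo.orders")
                             .select("last_loaded_at")
                             .collect()[0][0])

    # 2. Extract only rows changed since that watermark (query pushdown).
    delta_query = f"(SELECT * FROM dbo.orders WHERE modified_at > '{last_watermark}') AS d"
    delta_df = spark.read.jdbc(jdbc_url, delta_query, properties=props)

    # 3. Land the delta in the lake.
    delta_df.write.mode("append").parquet("/mnt/raw/orders/")

    # 4. Advance the watermark only after a successful write, so a failed run
    #    can be re-run from the previous watermark (simple recovery/restart logic).
    new_watermark = delta_df.agg({"modified_at": "max"}).collect()[0][0]
    print(f"Loaded delta up to {new_watermark}; update dbo.etl_control to this value.")

In practice the control-table update in step 4 would typically be done through a
stored procedure or an ADF Stored Procedure activity once the copy succeeds.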
Project 4:
Organization : 24/7
Customer : DirecTV Now (streaming service)
Period : July 2017 to June 2018
Description: DirecTV Now is a streaming service. DirecTV Stream is a family of
streaming multichannel television services offered in the United States by DirecTV.
Roles & Responsibilities:
(Dudam Harish)