Role & responsibilities:
- Understand requirements and take part in discussions on the technical and functional design of the sprint/module/project
- Design and implement end-to-end data solutions (storage, integration, processing, and visualization) in Azure
- Ingest data into Azure Data Factory and Azure Data Lake Storage (ADLS) from sources such as SQL Server, Excel, Oracle, and Azure SQL
- Extract data from one database and load it into another
- Build data architecture for the ingestion, processing, and surfacing of data for large-scale applications
- Use a range of scripting languages, understanding the nuances and benefits of each, to integrate systems
- Research new methods to acquire data and new applications for existing data
- Work with other members of the data team, including data architects, data analysts, and data scientists
- Prepare data sets for analysis and interpretation
- Perform statistical analysis and fine-tuning using test results
- Create libraries and extend existing frameworks
- Create design documents based on discussions and assist in providing technical solutions for business processes
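At its simplest, the extract-and-load responsibility above is a copy operation between two databases. A minimal sketch, using Python's built-in sqlite3 in place of the SQL Server/Azure SQL endpoints a real pipeline would connect to (the table and column names are illustrative):

```python
import sqlite3

def copy_table(src_conn, dst_conn, table):
    """Extract all rows from `table` in the source database and
    load them into a same-named table in the destination database."""
    rows = src_conn.execute(f"SELECT id, name FROM {table}").fetchall()
    dst_conn.execute(f"CREATE TABLE IF NOT EXISTS {table} (id INTEGER, name TEXT)")
    dst_conn.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
    dst_conn.commit()
    return len(rows)

# Two in-memory databases stand in for the source and the sink.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Asha"), (2, "Ravi")])
src.commit()

copied = copy_table(src, dst, "customers")
```

In a production Azure pipeline this same pattern is typically expressed as a Data Factory Copy activity rather than hand-written cursor code.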
Preferred candidate profile
- In-depth understanding of database management systems, online analytical processing (OLAP), and the ETL (extract, transform, load) framework
- 3+ years of overall experience with Azure, Data Factory, and .NET
- Strong in Data Factory; able to create both manually and automatically triggered pipelines
- Able to create, update, edit, and delete ETL jobs in Azure Synapse Analytics
- Able to recreate existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environment
- Knowledge of SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS)
- Proven ability to take initiative and be innovative
- Analytical mind with a problem-solving aptitude
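The manual-versus-auto-trigger distinction above refers to how a Data Factory pipeline is started: a manual run invokes the pipeline directly, while a schedule trigger fires it on a recurrence. A sketch of a schedule-trigger definition in the shape ADF uses, held here as a Python dict (the trigger and pipeline names are placeholders, not from the posting):

```python
import json

# Illustrative ADF schedule-trigger definition. "DailyLoadTrigger" and
# "CopySalesPipeline" are hypothetical names; the recurrence block runs
# the referenced pipeline once per day starting from the given timestamp.
schedule_trigger = {
    "name": "DailyLoadTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2024-01-01T00:00:00Z",
                "timeZone": "UTC",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopySalesPipeline",
                    "type": "PipelineReference",
                }
            }
        ],
    },
}

print(json.dumps(schedule_trigger, indent=2))
```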