
Azure Big Data Engineer | 4 to 8 Yrs

Bengaluru
Hyderabad
Hybrid
Mid-Level: 4 to 8 years
10L - 25L (Per Year)
Posted on Aug 20, 2024

About the Job

Skills

Azure Data Factory
Azure Databricks
Azure Synapse Analytics
Big Data Technologies
SQL
Python


Job Title: Azure Big Data Engineer

Years of Experience: 4 to 8 Years

Relevant Experience: 4+ Years

Headcount: 10

Job/Work type: WFO (5 days a week for the first 3 months; hybrid 3 days a week thereafter)

Location: Bangalore (preferred), Hyderabad

Time Zone: IST (10:00 am to 7:00 pm)


Role & Responsibilities / Job Description:

Mandatory Skills:

Azure skills, including:

  • Azure Databricks
  • Azure Synapse
  • Azure Data Factory (ADF)
  • Azure Integration Services
  • Azure Functions
  • Azure Data Lake
  • Azure API Management
  • CI/CD and PowerShell Scripts
  • ETL Pipelines
  • Unity Catalog
  • Azure SQL Database
  • SQL Coding

Responsibilities:

  • Overall 4 to 8 years of experience, with a minimum of 4 years of relevant professional work experience with Azure Synapse and Azure Databricks.
  • ETL Consultant (Databricks / ADF / Synapse): build a common logging mechanism for all ETL processes, build the ingestion process for fact data tables, build a temporary pipeline to load data on-premises, prepare a flow diagram covering all required ETLs, and prepare the folder design document for the Netezza box.
  • Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
  • Work with data analysts to understand data needs and create effective data workflows.
  • Create and maintain data storage solutions using Azure Databricks, Azure Functions, Azure Data Lake, Azure Synapse, Azure Integration Services, Azure API Management, CI/CD and PowerShell scripts, ADF, ETL pipelines, Unity Catalog, Azure SQL Database, and SQL coding.
  • Use Azure Data Factory to create and maintain ETL (extract, transform, load) operations via ADF pipelines.
  • Hands-on experience with Databricks for implementing transformations and Delta Lake (see the sketch after this list).
  • Hands-on experience with serverless SQL pools and dedicated SQL pools.
  • Use ADF pipelines to orchestrate end-to-end data transformation, including the execution of Databricks notebooks.
  • Experience working with the Medallion (bronze/silver/gold) architecture.
  • Experience building CI/CD pipelines using Azure DevOps.
  • Attach both ADF and Azure Databricks (ADB) to DevOps.
  • Create and manage Azure infrastructure across the landscape using Bicep.
  • Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
  • Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
  • Monitor and resolve data pipeline problems to ensure data consistency and availability.
  • Good to have: exposure to Power BI and Power Automate.
  • Good to have: knowledge of Azure fundamentals.
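
As a rough illustration of the Databricks, Delta Lake, Medallion architecture, and data-cleansing items above, here is a minimal PySpark sketch of a bronze/silver/gold flow. It is a sketch under assumptions, not part of the role definition: the storage paths, column names, and sales dataset are hypothetical, and in practice an ADF pipeline would typically trigger a notebook like this through a Databricks notebook activity.

# Minimal Medallion-style Delta Lake sketch for a Databricks notebook.
# Paths, column names, and the sales dataset are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-provided in Databricks

bronze_path = "abfss://lake@storageacct.dfs.core.windows.net/bronze/sales"
silver_path = "abfss://lake@storageacct.dfs.core.windows.net/silver/sales"
gold_path = "abfss://lake@storageacct.dfs.core.windows.net/gold/sales_daily"

# Bronze: land raw files as-is, preserving source fidelity.
raw = (spark.read
       .option("header", "true")
       .csv("/mnt/landing/sales/*.csv"))
raw.write.format("delta").mode("append").save(bronze_path)

# Silver: validate and cleanse (typing, de-duplication, null filtering).
silver = (spark.read.format("delta").load(bronze_path)
          .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
          .withColumn("sale_date", F.to_date("sale_date"))
          .dropDuplicates(["order_id"])
          .filter(F.col("order_id").isNotNull()))
silver.write.format("delta").mode("overwrite").save(silver_path)

# Gold: business-level aggregate for downstream reporting (e.g. Power BI).
gold = (silver.groupBy("sale_date")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("order_id").alias("order_count")))
gold.write.format("delta").mode("overwrite").save(gold_path)

Each layer is written as a Delta table so the silver and gold steps get ACID guarantees and time travel, which is the usual reason Medallion pipelines standardize on Delta Lake.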


Desirable Skills

  • Good ability to anticipate issues and formulate remedial actions
  • Sound interpersonal and team-working skills
  • Sound knowledge of unit testing methodologies and frameworks
  • Able to work independently
  • Good communication skills

About the Company

Ahvi Infotech helps global enterprises and fast-growing startups recruit remote developers and build engineering teams. With a talent pool of over 2 million developers, Ahvi has placed software engineers at more than 300 companies, including SAP, Rolls-Royce, Johnson & Johnson, Samsung, Kellogg's, and Daimler.

Industry

IT Services and IT Consulting

Company Size

501-1000 Employees

Headquarter

Singapore, Singapore
