GCP Lead Engineer
About the Job
Functional Title: Lead GCP Engineer
Location: MG Road
Start Date: ASAP
Work mode: 2 days/week in office
Job Description:
About the Role: We are seeking a Lead GCP Data Engineer who can demonstrate a broad range of technical skills across Google Cloud Platform (GCP) and third-party technologies. As a pivotal member of our team, you will be responsible for designing robust data architectures, developing efficient data pipelines, and optimizing data processing workflows on GCP. Your expertise will be instrumental in driving our data engineering initiatives forward, ensuring scalability, reliability, and performance.
Key Responsibilities:
- Data Architecture Design: Design scalable and efficient data architectures on GCP that meet the organization's data processing and analysis requirements. Collaborate closely with data scientists, business analysts, and stakeholders to define effective data models and structures.
- Data Pipeline Development: Develop and implement data pipelines using GCP services such as Google Cloud Storage, BigQuery, Dataflow, and Pub/Sub. Ensure data quality, reliability, and governance throughout the data lifecycle.
- Data Transformation and Integration: Utilize technologies like Apache Beam, Apache Spark, and Cloud Dataproc to transform and integrate data from diverse sources. Perform data cleansing, aggregation, enrichment, and normalization to support downstream applications and analytics.
- Performance Optimization: Optimize data processing workflows to enhance performance and efficiency. Monitor pipelines, identify bottlenecks, and implement optimizations such as improved data partitioning, sharding, and leveraging GCP's autoscaling capabilities.
- Continuous Improvement: Stay abreast of advancements in data engineering and cloud technologies. Explore and implement new GCP features and services to enhance data processing capabilities and drive innovation within the team.
- Research and Innovation: Conduct research on emerging data engineering technologies, tools, and best practices. Evaluate new methodologies to improve data engineering processes and bring innovative solutions to the organization.
- Task Automation: Automate data engineering tasks using scripting, workflows, or tools like Cloud Composer and Cloud Functions. Streamline data ingestion, transformation, monitoring, and other operational processes to improve efficiency and reduce manual effort.
Attributes & Competencies:
- Education: BE/BTech, MTech, or MCA.
- Experience: Minimum 7 years in development/migration projects, with at least 3 years focused on GCP. Experience in GCP-based Big Data deployments (batch/real-time) using BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, and Airflow.
Technical Skills:
- Google Certified Professional Cloud Architect with experience automating and orchestrating workloads on GCP or other public clouds.
- Proficiency in at least one configuration management system (Chef, Puppet, Ansible, Salt, etc.).
- Strong programming skills in Python and Go, with expertise in Git and Git workflows.
- Demonstrated proficiency in CI/CD tools such as Jenkins, TeamCity, or Spinnaker.
Interested candidates can share their profiles directly at mgarg@kognivera.com
About the company
Industry: IT Services and IT Consul...
Company Size: 51-200 Employees
Headquarters: Bangalore