
Data Engineer

EXL

Anand · Full-time · Mid Level · On-site

Job Description

Role: Data Engineer – Google Cloud Platform (GCP) with SQL
Location: Gurugram (Hybrid)
Experience: 3–7 years
Role Type: Full-time

Job Summary

We are looking for a Data Engineer with strong experience in Google Cloud Platform (GCP) and SQL to design, build, and maintain scalable data pipelines and lakehouse/warehouse solutions. The ideal candidate will be proficient in SQL and GCP services (especially BigQuery, Cloud Storage, Dataflow, and Cloud SQL) and will partner with data analysts, product, and business teams to ensure high-quality, reliable data is available for reporting and analytics.

Key Responsibilities

Data Pipeline Development (GCP)

Design and develop ETL/ELT pipelines on GCP using services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions.

Implement and maintain batch and streaming data pipelines that ingest, transform, and load data from multiple sources into GCP.

SQL & Data Modeling

Write efficient, complex SQL queries for data extraction, transformation, and reporting. Design and maintain normalized and dimensional data models (data marts, warehouse schemas) in BigQuery and SQL-based engines.
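For illustration only (not part of the posting): the ingest–transform–load pattern described above can be sketched in miniature with Python's standard library. Here an in-memory SQLite table stands in for the warehouse target, and all table, field, and record names are hypothetical.

```python
import sqlite3

# Hypothetical raw records from an upstream source (e.g. files landed in Cloud Storage).
raw_rows = [
    {"id": "1", "amount": "10.50", "region": " north "},
    {"id": "2", "amount": "7.25", "region": "SOUTH"},
]

def transform(row: dict) -> tuple:
    """Cast string fields to proper types and normalize values before loading."""
    return (int(row["id"]), float(row["amount"]), row["region"].strip().lower())

# Load step: SQLite stands in for BigQuery / Cloud SQL in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", map(transform, raw_rows))

print(conn.execute("SELECT * FROM sales ORDER BY id").fetchall())
```

A production pipeline would replace the in-memory pieces with the GCP services named above, but the extract/transform/load separation is the same.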

Tune SQL queries and optimize data models for performance and cost on BigQuery.

GCP Data Services

Work with BigQuery for analytics workloads, including partitioning, clustering, and materialized views. Use Cloud SQL (MySQL/PostgreSQL) for transactional or operational data layers and integrations.

Leverage GCS, Dataflow, and Cloud Scheduler for orchestration and automation of data workflows.

Data Quality & Governance

Implement data quality checks, validation rules, and monitoring on data pipelines. Ensure data consistency, integrity, and lineage across sources, pipelines, and reporting layers.
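As an aside (not part of the posting), the kind of validation rule this section describes can be sketched as a small record-level check; the schema, field names, and rules below are hypothetical examples.

```python
from datetime import datetime

# Hypothetical schema: every record must carry these fields.
REQUIRED_FIELDS = {"order_id", "order_date", "amount"}

def validate_record(record: dict) -> list:
    """Return a list of human-readable data-quality violations (empty if clean)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not (
        isinstance(record["amount"], (int, float)) and record["amount"] >= 0
    ):
        errors.append("amount must be a non-negative number")
    if "order_date" in record:
        try:
            datetime.strptime(record["order_date"], "%Y-%m-%d")
        except (TypeError, ValueError):
            errors.append("order_date must be YYYY-MM-DD")
    return errors

good = {"order_id": 1, "order_date": "2024-05-01", "amount": 99.5}
bad = {"order_id": 2, "amount": -5}
print(validate_record(good))  # []
print(validate_record(bad))
```

In a real pipeline such checks would run inside the Dataflow job or as a post-load step, with failures routed to monitoring rather than printed.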

Collaboration & Documentation

Collaborate closely with Data Analysts, Product Managers, and Data Scientists to understand business requirements and translate them into data models and pipelines. Maintain clear documentation for data models, APIs, and ETL jobs for internal stakeholders.

Required Skills & Qualifications

Experience: 3+ years of experience as a Data Engineer or ETL Developer. 1–3 years of hands-on experience with Google Cloud Platform (GCP), especially BigQuery, Cloud Storage, Dataflow, and Cloud SQL.

Technical Skills: Strong proficiency in SQL (queries, joins, aggregations, CTEs, window functions). Experience with data modeling (star/snowflake schemas, facts/dimensions). Familiarity with Python or another scripting language for pipeline automation.

Basic understanding of distributed systems and cloud data architectures.

Soft Skills: Good analytical and problem-solving skills. Strong communication and collaboration skills with cross-functional teams.
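For candidates unsure what "CTEs and window functions" means in practice, here is a minimal illustration (not part of the posting) run against an in-memory SQLite table; the `orders` table and its columns are invented for the example, and the same SQL shape works in BigQuery.

```python
import sqlite3

# Hypothetical fact table standing in for a BigQuery dataset.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (region TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('north', '2024-01-01', 100.0),
  ('north', '2024-01-02', 150.0),
  ('south', '2024-01-01', 80.0),
  ('south', '2024-01-03', 120.0);
""")

# CTE + window function: running total of sales per region, ordered by date.
query = """
WITH daily AS (
  SELECT region, order_date, SUM(amount) AS day_total
  FROM orders
  GROUP BY region, order_date
)
SELECT region,
       order_date,
       SUM(day_total) OVER (
         PARTITION BY region ORDER BY order_date
       ) AS running_total
FROM daily
ORDER BY region, order_date;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The CTE (`WITH daily AS ...`) pre-aggregates per day, and the `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` window accumulates within each region without collapsing rows.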

Posted Today
