| Company | Deloitte |
| Entity | Deloitte Touche Tohmatsu India LLP |
| Job Title | Technology & Transformation – GCP Data Engineer |
| Designation | Senior Consultant |
| Job Requisition ID | 99853 |
| Location | Pune |
| Practice / Team | Technology & Transformation – Engineering / Data & Analytics |
| Role Objective | Design, build, and deploy scalable data engineering solutions using Google Cloud Platform and modern data technologies |
| Core Responsibilities | Design and develop data solutions, automate processes, maintain documentation, resolve operational issues, and support data pipeline implementation |
| Process Understanding | Analyze existing business processes and design automated workflows |
| Automation Development | Configure and manage automation workflows and robotic process automation (RPA) workflows |
| Documentation & Testing | Maintain documentation and support user acceptance testing (UAT) and production testing |
| Stakeholder Collaboration | Work with process owners, analysts, and business stakeholders to design data solutions |
| Primary Cloud Platform | Google Cloud Platform (GCP) |
| GCP Services Required | BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer (Airflow), IAM |
| Database Technologies | Bigtable, Spanner, Cloud SQL, AlloyDB |
| Programming Languages | SQL, Python, Java, Scala |
| Data Engineering Skills | Data modeling, ETL pipeline development, data warehousing concepts |
| Database Expertise | Relational databases such as MySQL, PostgreSQL, Oracle |
| Data Migration Skills | Design and implement database migration strategies across multiple platforms |
| CI/CD Tools | Git, Jenkins, SonarQube, Artifactory, Docker |
| Workflow Orchestration | Apache Airflow / Cloud Composer |
| Big Data Tools | Spark, Hadoop, Flink (preferred) |
| NoSQL Databases (Preferred) | MongoDB, Cassandra, DynamoDB, Scylla |
| Performance Optimization | Optimize data pipelines for performance and cost efficiency |
| Data Quality Management | Implement monitoring, validation, and integrity checks for data pipelines |
| Data Visualization Collaboration | Work with analysts and data scientists to deliver insight-driven data solutions |
| Additional Skills (Preferred) | Unix/Linux scripting, version control with Git, CI/CD for data pipelines |
| Work Model | Hybrid working environment |
| Travel Requirement | Occasional travel to client locations |
| Key Competencies | Analytical thinking, problem solving, stakeholder communication, and technical solution design |