Design, build, and maintain data pipelines to support analytics, reporting, and operational needs using modern data engineering tools and frameworks.
Develop and maintain ETL/ELT processes that clean, transform, and enrich raw data into structured and usable forms.
Work with cloud platforms such as AWS, Azure, or GCP to support data storage, compute, orchestration, and deployment.
Support the development of data warehouses, data lakes, and BI platforms optimized for analytics, reporting, and insight delivery.
Collaborate with data analysts, cross-functional teams, and external clients to deliver integrated and efficient data solutions.
Optimize data infrastructure for scalability, performance, and cost efficiency.
Implement and follow data governance standards, including data quality checks, documentation, monitoring, and security best practices.
Minimum Qualifications
Bachelor's degree in Computer Science, Informatics, Information Systems, Statistics, Engineering, or a related field.
2-3 years of experience in data engineering or related roles, with hands-on experience using cloud platforms, preferably Google Cloud Platform (GCP); experience with other cloud providers is also acceptable.
Strong understanding of SQL and data modeling concepts.
Proficiency in programming languages (e.g., Python, Java, or Scala).
Technical Skills
Strong SQL and data transformation fundamentals.
ETL/ELT development.
Cloud data platforms (GCP preferred; AWS or Azure also acceptable).
Strong understanding of data warehouse and data lake concepts.
API integration and programming.
Leadership Skills
Teamwork, communication, and cross-functional collaboration.
Analytical thinking and problem-solving.
Positive attitude and self-motivation.
Attention to detail.