
Key Responsibilities
Design, build, and maintain data pipelines to support analytics, reporting, and
operational use cases using Microsoft Fabric or similar platforms (e.g., Azure
Databricks, Snowflake, BigQuery, Redshift).
Develop and optimize ETL/ELT workflows across multiple source systems,
ensuring reliable, high-performance data integration.
Write and maintain notebooks (Python, SQL) with clean, efficient, and
well-structured code.
Ensure data accuracy, integrity, and security through validation,
monitoring, and troubleshooting.
Collaborate with stakeholders to understand data requirements and translate
them into scalable data solutions.
Contribute to the continuous improvement of data engineering standards, data
quality checks, technical documentation, and best practices.
Requirements
Bachelor's degree (S1) in Computer Science, Information Systems, or a related field.
1-3 years of hands-on experience in Data Engineering, Data Warehousing, Business Intelligence, Analytics, or a related technical field.
Strong proficiency in SQL (must-have), including experience writing complex queries and optimizing their performance.
Solid understanding of ETL/ELT processes, data modeling, and data warehouse concepts.
Good knowledge of Python or an equivalent programming language.
Strong communication skills in English, both written and verbal.
Familiarity with modern data platforms such as Microsoft Fabric, Azure Databricks, Snowflake, BigQuery, or Redshift is highly preferred.
Exposure to BI tools such as Power BI or Tableau is a plus.
Job ID: 144196869