About The Role:
We are seeking a detail-oriented and proactive Data Quality Engineer to join our team. The role is responsible for ensuring the accuracy, completeness, consistency, and reliability of data across various systems and platforms. The Data Quality Engineer will collaborate with data engineers, analysts, and business stakeholders to design and implement processes, tools, and frameworks, particularly leveraging dbt and other modern data quality tools, to improve and maintain high-quality data for decision-making and business operations.
Key Responsibilities:
- Design, develop, and maintain data quality frameworks and validation rules, with a strong focus on dbt testing and dbt documentation.
- Implement automated data quality checks within ETL/ELT pipelines (e.g., dbt tests, Great Expectations).
- Collaborate with the data engineering and analytics teams to integrate quality checks into data workflows.
- Define and monitor data quality metrics (DQMs), such as completeness, validity, uniqueness, and timeliness.
- Investigate and resolve data quality issues by identifying root causes and working with relevant teams.
- Document data quality processes, dbt models, and best practices.
- Partner with stakeholders to define data quality requirements for new projects and integrations.
- Support regulatory, compliance, and governance initiatives by maintaining high standards of data integrity.
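To illustrate the kind of data quality metrics mentioned above (completeness, uniqueness), here is a minimal Python sketch of two such checks; the column values and function names are illustrative, not part of any specific pipeline or tool:

```python
# Minimal sketch of two data quality metrics (DQMs): completeness and
# uniqueness. Column values below are hypothetical examples.

def completeness(values):
    """Fraction of non-null values in a column."""
    if not values:
        return 0.0
    return sum(v is not None for v in values) / len(values)

def uniqueness(values):
    """Fraction of distinct values among the non-null entries."""
    non_null = [v for v in values if v is not None]
    if not non_null:
        return 0.0
    return len(set(non_null)) / len(non_null)

# Example: a hypothetical customer_id column with one null and one duplicate.
column = ["c1", "c2", None, "c2", "c3"]
print(completeness(column))  # 0.8
print(uniqueness(column))    # 0.75
```

In practice these checks would typically be expressed declaratively (e.g., dbt `not_null` and `unique` tests, or Great Expectations expectations) rather than hand-rolled, but the underlying metrics are the same.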
About You:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- Proven experience in data quality, data engineering, or data governance roles.
- Strong SQL skills and hands-on experience with databases, dbt, ETL/ELT tools, and data pipelines.
- Familiarity with cloud data platforms (e.g., BigQuery).
- Experience with dbt testing frameworks (e.g., schema tests, custom tests, data documentation).
- Knowledge of data quality frameworks, profiling tools, or data catalog solutions (e.g., Great Expectations, Informatica DQ, Collibra).
- Strong analytical and problem-solving skills with keen attention to detail.
- Good understanding of data governance, master data management, and regulatory compliance.
- Excellent communication skills and ability to work with cross-functional teams.
Nice to Have:
- Experience with Python for custom dbt tests or validation scripts.
- Exposure to streaming data pipelines (Kafka, Pub/Sub).
- Knowledge of Agile/Scrum methodologies.