Job Description:
To architect and maintain the organization's Integration and Analytical data infrastructure. This role ensures high-quality data flow from heterogeneous sources into a unified platform while establishing the technical standards and best practices for data modeling. The role empowers data team members by providing a high-performance, governed environment (OLAP engines, Data Catalog, and Quality Frameworks) and ensuring technical data contracts are enforced.
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- Typically 5+ years of relevant experience in Data Engineering, with a proven track record of architecting integration layers and managing complex data warehouses.
- Deep understanding of Distributed Systems, OLAP optimization, Data Governance frameworks, and Data Modeling best practices.
Technical Competencies
- Expert-level SQL and Python.
- Modern data warehouse/OLAP expertise.
- Data Orchestration (Experience with Airflow or similar tools).
- Technical Governance Tools (Experience with OpenMetadata, DataHub, or Amundsen).
- Data Contracts & Lineage (ability to define schemas and track data flow across systems).
- Software Engineering Basics (Docker, Git, and CI/CD for automating deployments).
- Integration Methods (Change Data Capture/CDC, Kafka, and API integration).
Only shortlisted candidates will be contacted.