Job Summary
Responsible for developing, maintaining, and optimizing data pipelines and infrastructure to support accurate reporting, analytics, and data-driven decision-making across the organization.
Job Description
- Design, build, and maintain reliable data pipelines and ETL/ELT processes
- Ensure data accuracy, consistency, and availability across systems
- Support data integration from multiple internal and external sources
- Monitor and optimize pipeline performance, data quality, and processing efficiency
- Collaborate with analysts, developers, and stakeholders on data needs
- Troubleshoot data-related issues and provide technical solutions
- Maintain documentation for data models, processes, and workflows
Job Requirements
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field
- Minimum of 12 years' experience in data engineering or a similar role
- Strong knowledge of SQL and data warehouse concepts
- Experience with ETL tools and data pipeline development
- Familiarity with Python, R, or similar programming languages
- Understanding of cloud data platforms and database technologies
- Strong analytical, problem-solving, and communication skills
- Detail-oriented and able to work in a fast-paced environment