Job Requirements:
- Bachelor's degree in Computer Science, Information Systems, Mathematics, Engineering, or a related field (or equivalent experience).
- 1+ years of experience working as a Data Engineer or in a similar role.
- Strong proficiency with SQL (optimization, complex queries, stored procedures).
- Experience with ETL/ELT workflows, preferably in Apache Airflow (see the sketch after this list).
- Solid experience in Python.
- Understanding of data modeling, data warehousing, and schema design.
- Familiarity with Redshift (maintenance tasks like vacuum, query optimization, pg_stat analysis).
- Knowledge of Docker and containerized deployments.
- Comfortable working in Linux/Ubuntu environments.
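
For the Airflow and Redshift items above, here is a minimal sketch of what an ETL/ELT DAG with a routine maintenance step can look like, assuming Apache Airflow 2.4+ with the Postgres provider package installed; the DAG id, connection id, and table name are hypothetical placeholders, not part of this role's actual stack.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.operators.postgres import PostgresOperator


def extract() -> None:
    """Pull rows from a source system (RDBMS, REST API, Kafka, or file)."""
    ...  # real extraction and load logic would go here


with DAG(
    dag_id="daily_warehouse_load",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)

    # Routine Redshift maintenance: VACUUM reclaims space and re-sorts rows.
    # VACUUM cannot run inside a transaction, so autocommit is enabled.
    vacuum_task = PostgresOperator(
        task_id="vacuum_sales",
        postgres_conn_id="redshift_default",  # hypothetical connection id
        sql="VACUUM FULL sales;",             # hypothetical table
        autocommit=True,
    )

    extract_task >> vacuum_task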
Job Description:
- Build and develop data ingestion from various sources: RDBMS, REST APIs, Kafka, text files, and spreadsheets (a sketch of one such path follows this list).
- Collaborate with the Analytics & Business Intelligence team to design scalable implementations of their models.
- Partner with teammates to create complex data processing pipelines that solve problems for our users.
- Design, develop, optimize, and maintain data architecture and pipelines.
- Quickly resolve performance and system incidents.
- Drive prioritization and strategy, with a focus on solving user problems.
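
As an illustration of the ingestion work above, here is a minimal sketch of one path (REST API to a local file staged for a warehouse load), assuming Python 3.9+ with the requests library; the endpoint URL and pagination scheme are hypothetical.

import csv

import requests

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint


def fetch_orders(page_size: int = 500) -> list[dict]:
    """Page through the API until an empty batch signals the end."""
    rows, page = [], 1
    while True:
        resp = requests.get(
            API_URL,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return rows
        rows.extend(batch)
        page += 1


def stage_to_csv(rows: list[dict], path: str = "orders.csv") -> None:
    """Write rows to a CSV file ready to be copied into the warehouse."""
    if not rows:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    stage_to_csv(fetch_orders())

In practice the staged file would typically be uploaded to S3 and loaded with a Redshift COPY command rather than kept locally.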
Note:
Working arrangement:
- 12-month contract through an outsourcing arrangement
- WFO (work from office) at Kedoya, West Jakarta
You can apply here or send your application to [Confidential Information] with the subject: Apply_Job Position_Your Name