Design, build, and maintain scalable and efficient ETL/ELT data pipelines using a variety of tools and technologies.
Manage and optimise our data warehouse (experience with Snowflake is preferred).
Collaborate with business intelligence professionals to ensure data is structured for optimal performance in reporting and analytics; experience with Sigma Computing or similar BI platforms is strongly preferred.
Monitor data pipelines for performance and reliability, and troubleshoot issues as they arise.
Implement data governance and security policies to ensure data integrity and compliance.
Collaborate with cross-functional teams to identify data needs and develop solutions that meet business requirements.
Stay up-to-date with industry trends and best practices in data engineering to continuously improve our data systems.
Minimum Qualifications
Have 3+ years of work experience (a data specialisation is preferred)
Have at least 3 years of hands-on experience with SQL
Proven skills in ETL and related data tooling: SSIS, Talend, Kettle/Pentaho, Debezium, Kafka, Flink, Spark, PostgreSQL, Oracle Database, SQL, NoSQL, GSQL and TigerGraph, Python, MongoDB
Have knowledge of ETL, CDC, event-based streaming, graph databases, REST APIs, Python, and basic programming
Eager and willing to learn new technologies
Available ASAP and willing to work from the office (WFO)
Experience in the insurance industry is a plus