Rukita

Senior Data Engineer

5-7 Years
  • Posted 6 days ago

Job Description

About Rukita:

Rukita is a property technology (proptech) company that provides long-stay rental housing solutions with end-to-end, hassle-free services in Indonesia. The company builds a holistic ecosystem of housing solutions for both tenants and landlords: it provides quality, accessible housing for young people while helping landlords transform their assets into high-yield rental properties.

Key Responsibilities:

  • Develop and maintain scalable, cost-efficient data pipelines (ETL/ELT) from multiple sources (databases, APIs, web pages, and files).
  • Develop and maintain data warehouses and data lakes for structured and semi-structured data.
  • Build and maintain APIs and other data exchange interfaces.
  • Ensure high availability and observability for all critical data services (e.g. BI Tools, Reverse ETL, Orchestrator).
  • Maintain data-related repositories, work toward zero silos, document clearly, and manage tech debt gracefully.
  • Actively communicate with all relevant stakeholders; push for and influence the prioritization of data-driven projects from a Data Engineering perspective.

Requirements:

  • Bachelor's Degree in Computer Science, Engineering, or other relevant technical fields.
  • 5+ years of experience in data engineering, with a proven track record of building and maintaining complex data pipelines, integration layers, and data warehouses while managing the cost of data operations.
  • Deep understanding of data modeling concepts and best practices.
  • Mastery of SQL and Python.
  • Expertise in a self-hosted, open-source modern data pipeline tech stack deployed on cloud platforms. Plus points for professional experience with:
      ◦ Data pipelines: Airflow, dbt, Dask
      ◦ Cloud platforms: AWS, GCP
      ◦ DWH and BI: BigQuery, Metabase
  • Experience with version control (Git) and CI/CD for automated deployments.
  • Experience in data streaming processes (e.g. Pub/Sub, Kafka)
  • Experience with observability tooling: setting up Alloy/OpenTelemetry, monitoring (e.g. Grafana), and optimizing performance.
  • Active, comprehensive, and clear communication, both spoken and written.
  • Experience with MLOps is a plus.
  • Experience leveraging LLMs to improve a data team's workflows is a plus.


Job ID: 142482097