
Meridia is an AgTech company specialising in field data for complex, smallholder-heavy agri-commodity supply chains such as cocoa, coffee, palm oil, rubber and soy.
Meridia Verify®, its flagship SaaS product, verifies supply chain field data in real time and validates compliance with frameworks such as the EUDR (EU Deforestation Regulation). With Verify, companies can reduce commercial and reputational risk, minimise supply chain disruption and accelerate evidence-based decision-making for strategic sourcing, procurement and trade.
Meridia's mission is to improve the veracity of the datasets driving decision-making in global agri-commodity supply chains. Its purpose is to build transparent and inclusive supply chains that afford smallholders a level playing field.
See for more information on our work.
Job Description
As a Data Engineer, you will serve as a data and analytics expert, modelling, structuring and scripting data flows. You will structure data in an automated way so that it can be processed into meaningful outputs.
You will be based in Indonesia and will work closely with our teams in The Netherlands, Lithuania and Ghana, as well as with colleagues across Indonesia. You will report to the Data Manager in Indonesia.
Responsibilities
The Data Engineer has the following areas of work and responsibilities:
Data Engineering
Core Data Engineering Skills
The ideal candidate should be eager and willing to learn and grow in the role. You will be engaged in on-the-job training and development to learn about the land rights sector and our data systems and workflows. You will be mentored by our Data Engineering team in Indonesia and senior colleagues in the Netherlands.
Requirements
The ideal candidate has/is:
The following are an advantage:
The benefits package includes:
Applications will be processed on a rolling basis. Please apply by submitting your details on Workable:
Only applications meeting the qualifications and submitted through Workable will be reviewed.
Our preferred starting date is 1st June 2026.
If you have any questions, feel free to reach out to [Confidential Information] for more information about this position.
Job ID: 147276863
Skills:
Composer, BigQuery, Data Warehousing, SQL, NoSQL, GCP, RDBMS, Python, Infrastructure as Code, Cloud Functions, Pub/Sub, Cloud Run, data quality tools, dbt