Data Integration Engineer
Contract or Temp
05.05.2025
1331668
Jump on this exciting role across a high-impact programme of work!
Why You’d Like It
- Short-term contract with an immediate start, running until the end of June.
- Work with modern technologies such as Azure, Databricks, Kafka, and Spark.
- Work on meaningful projects with large datasets that make an impact.
- Auckland CBD based.
- 2 days WFH per week.
As part of their ongoing data transformation, this organisation is growing its team and looking for a Senior Data Engineer with strong ingestion and integration experience using Kafka. You’ll be focused on data pipeline development, while getting exposure to modern data tools and platforms. It’s a supportive environment where you can build your skills, work on big data projects, and take your career to the next level.
Company Profile
This business operates in a complex and fast-moving industry, where data plays a crucial role in decision-making and operational efficiency. Their technology team is highly valued, working across engineering, analytics, and architecture to deliver best-in-class data solutions.
Your Role
- Develop and manage outbound data pipelines to Confluent Kafka, ensuring seamless integration.
- Design scalable ingestion solutions that reduce pipeline maintenance through intelligent grouping strategies.
- Load and transform data from the lake into Databricks, using Delta Live Tables in both streaming and micro-batch models.
- Design and deliver complex rate and charge calculations.
- Work closely with experienced engineers and analysts.
Your Fit
- Strong SQL skills with the ability to implement complex business logic in data transformations.
- Hands-on experience with Confluent Kafka, SQL, data lakes, Apache Spark, and Databricks.
- Understanding of near real-time data processing and event-driven architectures.
- Familiarity with Delta Live Tables (DLT), including both streaming and micro-batch processing.
- Ability to deliver high-quality outcomes quickly and pragmatically in a fast-paced environment.
- Enjoy working with data and solving technical challenges.
* Applicants must be legally entitled to work in New Zealand. If you are not a NZ Citizen, you must have the right of permanent residence or a work permit.
Like the sound of that?
If this sounds like you, please reach out to me at
sofia@digitalgarage.co.nz