Data Engineer - Sydney
Barangaroo, New South Wales, AU
Role: Data Engineer
Location: Sydney
What did you have for breakfast today? Whether it’s the flour in your toast or the grain in your cereal, it’s highly likely that GrainCorp helped get it onto your plate! As we find new ways to connect rural communities with food, animal feed, and industrial customers around the world, we’re proud to be leading the way in sustainable agriculture.
What our team says:
“GrainCorp’s supportive culture and strong leadership have empowered my rapid growth, from Junior to Intermediate Data Engineer in just five months. The Advanced Analytics Team’s collaborative spirit and focus on continuous learning make it an inspiring place to build a rewarding career.”
GrainCorp is currently seeking a Data Engineer to join a motivated and collaborative team delivering data, infrastructure, and system activities to support advanced analytics use cases. You’ll have the opportunity to work on multiple exciting AI and IoT projects and be trained by experts who’ve been in this space for decades.
The responsibilities of the role include:
- Transform design and product vision into working solutions for end users, fostering a culture of sharing, reuse, scale, stability, and user-first design.
- Design, develop, optimize, and maintain data engineering solutions for analytical use cases including architecture, data products/pipelines, and infrastructure using industry best practices.
- Stay up to date with new techniques and apply evolving industry standards to individual projects.
- Support continuous team growth.
About your experience:
You’ll bring experience in delivering data engineering solutions and collaborating across teams. You’ll also demonstrate:
- At least 5 years of hands-on experience in data engineering solution delivery, with strong problem-solving and communication skills.
- Experience in DataOps, InfraOps, DevOps, and optionally MLOps (Azure preferred), with proficiency in at least one programming language (Python preferred).
- Data warehouse design and implementation to support AI use cases and BI insights.
- Experience in Azure IoT and Gen-AI solution implementation.
- Bachelor’s degree in Computer Science, MIS, or Engineering preferred.
About your skills:
Technical:
- ETL using PySpark, Python, SQL, etc., to create data products for AI/ML and BI consumers.
- Data warehouse / Lakehouse design and implementation.
- Delta Lake storage and Apache Spark engine performance optimization.
- Azure: Databricks (Data Engineering and Gen-AI), IaC (Bicep preferred), DevOps, Azure ML workspaces, vNet, etc.
- Insight reporting tools (Power BI preferred).
Non-technical:
- Strong problem-solving and communication skills.
- Collaborative team player with agile project delivery experience.
What we offer:
- Professional development & leadership programs
- Hybrid work and flexible leave options including birthday leave
- Health & wellbeing support
- Inclusive, values-driven culture
- We’re proud to be a Family Inclusive Workplace accredited employer, supporting balance, care, and flexibility in every career.
- Overall satisfaction in the Advanced Analytics team’s employee surveys has exceeded 90% in each of the last two years, and the keywords current employees most often use to describe the team include “recognition”, “support”, and “collaboration”.
Ready to apply?
It’s simple — submit your application. If your background aligns, our team will be in touch for a quick chat about your experience. We’re looking forward to getting to know you!
Progressed candidates will be required to provide proof of working rights and suitable professional referees.