Data Engineers

Multiple Operations Data Engineers are required to support and maintain enterprise data assets across a modern cloud-based data platform. These roles play a key part in ensuring the reliability, performance, and operational stability of data services, supporting both business-as-usual (BAU) delivery and critical processing cycles.

The positions focus on operational support of data pipelines, month-end processing, and responding to data-related requests, while ensuring service level agreements (SLAs) are consistently met across the platform.


Key Responsibilities
  • Support and maintain data assets across cloud data lake, cloud EDW, legacy EDW, and analytics platforms
  • Provide Level 2/3 technical support across AWS, Control-M, Snowflake, Teradata (legacy), and ETL-based integration solutions
  • Support daily data delivery processes and month-end operational workloads
  • Develop and maintain SQL-based analytical and ETL code
  • Monitor and manage scheduling workflows using Control-M or similar tools
  • Work with file transfer solutions such as GoAnywhere or equivalent
  • Apply DevOps practices and tools in line with established delivery processes
  • Maintain system and technical documentation
  • Analyse, size, prioritise, and queue work items
  • Collaborate with data consumers, database developers, testers, and IT support teams
  • Drive continuous improvement in coding standards, delivery practices, and documentation, including knowledge sharing and training

Essential Skills & Experience
  • Strong experience with AWS data engineering services including Glue, S3, Lambda, EC2, and RDS
  • Proven experience designing and developing data pipelines using ETL/ELT frameworks
  • Hands-on experience with Snowflake (cloud EDW) and/or Teradata (on-prem EDW)
  • High level of programming capability, particularly in Python
  • Experience with Control-M orchestration, monitoring, or similar scheduling tools
  • Strong background in operational support and BAU environments

Desirable Experience
  • Infrastructure-as-code exposure using CloudFormation or Terraform
  • Experience with ETL tools such as dbt, Talend, Informatica, or similar
  • Advanced SQL capability
  • Exposure to SAS platforms (Base, Enterprise Guide, or SAS Viya)

This role is offered as an initial 12-month contract with the opportunity to extend.

Multiple roles are available in Sydney and Melbourne, due to start in early February 2026.

To apply, please send your resume or contact Nat on 0430 292 875 or natalie@blackroc.co