3 days in the office | 2 days from home
- Work closely with business stakeholders to build scalable batch ELT pipelines on Azure Data Factory for analytics.
- Build and enhance Power BI dashboards.
- Swiftly diagnose and fix data-pipeline incidents to restore reliable, timely data.
About Us
G'day Group is innovating tourism across 300+ locations, with diverse roles in tech, finance, marketing and more. Our dynamic, flexible environment promotes growth and wellbeing. Shape the future of travel and elevate your career here.
Our Benefits
- Health & Wellbeing: Flexible work, EAP, discounted health cover.
- Generous Leave: Parental, volunteer and study leave, plus the option to purchase more.
- Development Focus: Leadership programs, training support, professional memberships.
- Employee Savings: Discounts on accommodation, experiences, salary packaging.
- ESG Commitment: Fostering a safe workplace, community support, and environmental protection through a five-year strategy.
Your New Role
This role offers the opportunity to strengthen your data engineering expertise while expanding into analytics. You’ll build and operate robust batch data pipelines and Lakehouse solutions, and work alongside stakeholders to understand business needs and develop Power BI dashboards that turn well-engineered data into practical insights.
- Design, build and operate reliable batch data pipelines and lakehouse solutions using Azure Data Factory and Microsoft Fabric, applying reusable and parameterised patterns for ingestion and incremental loads.
- Develop and optimise data transformations using SQL and Python, working with Synapse/Fabric notebooks.
- Contribute to monitoring, troubleshooting and continuous improvement of data pipelines, including incident resolution and prevention.
- Work closely with analytics and business stakeholders to develop your capability in building Power BI dashboards, helping turn trusted data into meaningful insights.
- Learn and apply automated testing approaches for data pipelines and transformations, supporting more reliable and maintainable data solutions.
- Collaborate within Agile delivery teams, contribute to documentation and support operational handover.
Your Skills and Experience
- Experience building or supporting batch data pipelines, preferably on Azure data platforms.
- Solid SQL skills and working knowledge of Python, with an interest in developing Spark-based data processing skills.
- Familiarity with Lakehouse or data warehouse concepts, including structured data modelling for analytics.
- An interest in dashboard development, with Power BI experience or a strong desire to build it.
- A collaborative mindset, curiosity to learn, and the ability to communicate effectively with both technical and non-technical stakeholders.
Join G'day Group: Disrupt, innovate, and ignite your career!
Police check & driver's licence required for this role.
Published on 08 Jan 2026, 12:55 AM
