Melbourne AI Engineering & Infrastructure Summit 2025
Shape the future of AI and join industry leaders for hands-on sessions and insights on scalable AI systems and high-performance infrastructure.

Join us at the AI Engineering and Infrastructure Summit to shape the future of AI systems.
In June, we're bringing together AI engineers, data scientists, and technology leaders to explore scalable AI systems and high-performance infrastructure.
Discover best practices for deploying AI models at scale, optimising data pipelines for machine learning workloads, and implementing continuous integration and deployment. Dive into Edge AI, discuss ethics in AI engineering, and debate whether cloud or on-prem solutions are best for AI development. Engage in interactive sessions, real-world case studies, panel discussions, and debates to stay ahead of emerging trends in AI engineering.
Key Themes:
- Building Scalable AI Systems
- Leveraging AI modernisation to transform applications and systems
- High-Performance AI Infrastructure
- Deploying AI Models at Scale
- Optimising Data Pipelines for ML Workloads
- Implementing Continuous Integration and Deployment
- Edge AI
- Ethics in AI Engineering
- Cloud vs. On-Prem: What Is Best for AI Development?
Who Should Attend?
AI engineers, data scientists, IT professionals, technology leaders, and anyone eager to enhance their understanding of AI engineering and infrastructure.
Don't miss this chance for a day of learning, innovation, and collaboration.
Program Highlights
Agenda
In the rush to implement Large Language Models, many organisations are overlooking the strategic value of traditional machine learning approaches. Through real-world examples and practical frameworks, this talk challenges the "LLM everywhere" mindset and demonstrates how a hybrid approach combining targeted traditional ML with LLMs can create more reliable, cost-effective, and safer AI systems.
How do you architect an AI-native platform purpose-built for vector search, LLM workflows, and scale? In this technical session, Relevance.ai Co-Founder Daniel Palmer shares the foundational decisions and cutting-edge infrastructure behind the platform's rapid growth. Gain insights into the real engineering behind enabling powerful, scalable, and enterprise-ready AI capabilities.
- Designing for Scale from Day One: How Relevance.ai architected a vector-first platform to support fast, scalable AI-driven use cases
- Compute and Cost Efficiency at Scale: Balancing performance, latency, and cost using smart infrastructure strategies across multi-cloud environments
- LLM Workflow Orchestration: Powering custom enterprise AI workflows with modular, flexible pipelines and real-time data processing
- From MVP to Enterprise-Ready: Evolving the platform to meet enterprise security, reliability, and integration requirements without sacrificing speed
This panel discussion dives into the compelling reasons to leverage AI when modernising tech estates and businesses, outlines sophisticated implementation approaches, and identifies the essential stakeholders who can champion a truly transformative agenda.
- Examining the drivers prompting organisations to transform applications and systems
- Highlighting key opportunities, including using AI both to drive modernisation and as part of the modernised state
- Understanding the cross-functional roles needed to orchestrate AI-enabled modernisation successfully
- Outlining advanced strategies for refining infrastructure, data pipelines, and MLOps processes to scale AI effectively
- Fostering collaboration, robust governance, and continuous learning to ensure AI’s long-term viability and impact
In this innovative session, attendees will be presented with a series of scenarios they may encounter in their roles. They will discuss the possible courses of action with their peers, weighing the ramifications of each option, before logging their own decision.
Results will be tallied and analysed by our session facilitator, and the outcomes will shape how the group moves through the activity.
Will we collectively choose the right course of action?
As one of Australia’s leading employment marketplaces, SEEK leverages AI to match job seekers with the right opportunities and assist employers in finding top talent. This session unveils how SEEK integrates responsible AI principles into its suite of products and programs—ensuring transparency, fairness, and trust across its extensive ecosystem of users and partners.
At the intersection of compliance, technology, and organisational culture, sendpayments.com has achieved a 40% improvement in engineering velocity by treating AI agents like human staff. Operating in a highly regulated industry, their strategy involves thoroughly vetting AI through policy and governance committees before any workflow implementation.
- Defining, Onboarding, and Reviewing AI Roles: Creating AI “job descriptions,” profiling responsibilities, and conducting performance evaluations as if AI were full-fledged team members
- Putting Tasks Before Technology: Ensuring AI augments rather than dictates workflows by focusing first on business requirements and operational needs
- Institutionalising Governance: Feeding every AI decision through policy and governance committees to uphold trust, transparency, and regulatory alignment
Explore the technical intricacies of designing, deploying, and scaling AI infrastructure. Delve into the tools, frameworks, and architectures that power high-performance AI solutions, and learn how to balance agility, security, and cost-efficiency.
- How do teams architect resilient, high-performance computing environments to support AI workloads at scale?
- How can teams ensure real-time, high-volume data flow for AI?
- Which pipelines streamline model development, deployment, and continuous monitoring?
Roundtable topics to be shared with registered attendees for their selection
In this interactive session, participants will explore and debate five hot-button issues shaping AI’s future. Expect divergent views and lively discussion on how these trends could redefine both engineering practices and business outcomes.
- Cloud vs. On-Prem High-Performance Computing – Balancing elasticity, control, and cost
- AutoML Tools – Do they empower teams or oversimplify complex engineering challenges?
- Ethical AI vs. Speed to Market – Should organisations slow innovation to ensure responsible development?
- Edge AI vs. Centralised Processing – Is pushing more AI to the edge truly efficient or overly complex?
- Low-Code/No-Code AI – Does democratising AI risk quality and governance, or is it the key to widespread adoption?
Who Attends?
Head of Machine Learning
Head of AI
Head of Engineering
Head of AI Engineering
Head of Data
Digital Transformation Director
Head of DevOps
Application Development Director
Software Architect
Cloud Architecture Manager
Site Reliability Engineering Manager
Head of Platform
Benefits For Attendees

Event Location
Metropolis Events

Get In Touch
Contact our event team with any enquiries

Danny Perry
For sponsorship opportunities.

Lili Munar
For guest and attendee enquiries.

Ben Turner
For speaking opportunities & content enquiries.

Taylor Stanyon
For event-related enquiries.