Strategy · 8 min read

The Australian Government Just Set the AI Infrastructure Bar. Every Business Needs to Read This.

On 23 March 2026, the Australian Government released its 'Expectations of data centres and AI infrastructure developers' — a direct policy signal about what responsible AI infrastructure looks like. With the Privacy Act deadline eight months away and the OAIC already checking, this document changes the risk calculus for every Australian business choosing where their AI runs.

Kishore Reddy Pagidi

AI PM at SOLIDWORKS. Founder, Akira Data.

*Published 28 March 2026.*

On 23 March 2026 — five days ago — the Australian Government released a document that has not received the attention it deserves from Australian mid-market businesses. Titled "Expectations of data centres and AI infrastructure developers," it is a direct policy statement from the federal government about what responsible AI infrastructure looks like in Australia.

This document matters because it signals, with unusual clarity, the direction of Australian AI governance. When a government publishes "expectations" for infrastructure developers, it is telling the market where compliance is heading. Businesses that read these signals early can build infrastructure that stays ahead of regulation. Businesses that miss them find themselves retrofitting.

The release follows Australia's National AI Plan published in late 2025 and arrives eight months before the Privacy Act's mandatory automated decision-making transparency obligations take effect on 10 December 2026. The timing is deliberate: the government is establishing the infrastructure standards that will underpin enforceable compliance requirements.

Here is what the document says, what it means for your business, and the practical steps to take in the next 30 days.

What the Government's Expectations Actually Cover

The document addresses data centres and AI infrastructure developers, but its implications flow through to every business that uses cloud AI services — which, in 2026, is most businesses.

The core expectations fall into three categories.

Sovereignty and Data Residency

The government expects AI infrastructure processing sensitive Australian data to maintain Australian data residency and to be able to demonstrate that residency. This is not new territory — Australian Privacy Principle 8 has always governed cross-border data transfers — but the government's explicit "expectation" language signals that it expects regulated entities and government-adjacent businesses to be operating on Australian-jurisdiction AI infrastructure as the default, not the exception.

The practical implication: if your AI systems are calling default API endpoints (api.openai.com, most Anthropic endpoints, and many Google AI endpoints all route through or default to US infrastructure), you are not meeting the spirit of these expectations when processing personal information about Australians.

The fix is infrastructure configuration, not switching providers. Every major AI provider has Australian-region options: AWS Bedrock in Sydney (ap-southeast-2), Azure OpenAI in Australia East, Google Vertex AI in australia-southeast1. Configure explicitly; do not rely on defaults.
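In code, region pinning is an explicit client setting, not something to inherit from environment defaults. A minimal sketch in Python: the region identifiers are the real Australian ones named above, while the helper function and its use are illustrative, not a prescribed implementation:

```python
# Explicit Australian-region identifiers for the major cloud AI providers.
AUSTRALIAN_REGIONS = {
    "aws_bedrock": "ap-southeast-2",          # AWS Sydney
    "azure_openai": "australiaeast",          # Azure Australia East
    "google_vertex": "australia-southeast1",  # Google Cloud Sydney
}

def bedrock_client_kwargs() -> dict:
    """Kwargs for boto3.client(...), with the region pinned to Sydney
    rather than inherited from AWS_DEFAULT_REGION or a profile default."""
    return {
        "service_name": "bedrock-runtime",
        "region_name": AUSTRALIAN_REGIONS["aws_bedrock"],
    }
```

The same principle applies to the other providers: set the region or endpoint explicitly in code or infrastructure templates, so that data residency is an auditable configuration choice rather than an accident of SDK defaults.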

Security Standards Alignment

The expectations align with the Australian Signals Directorate's Essential Eight framework and the PSPF (Protective Security Policy Framework) for government-adjacent deployments. For private sector businesses, the relevant signal is that AI infrastructure security is now being evaluated against the same frameworks as other critical technology infrastructure.

For APRA-regulated entities, this reinforces the application of CPS 234 to AI infrastructure: your AI model providers are third-party technology service providers and should be assessed as such under your vendor risk management framework.

For non-regulated businesses, this signals the direction of industry standards. Businesses that implement Essential Eight-aligned AI infrastructure now will not need to retrofit when these standards become contractual requirements for government and enterprise clients.

Environmental and Operational Accountability

The expectations include energy efficiency, water usage, and operational resilience requirements for data centre operators. For businesses choosing AI infrastructure, this creates a due diligence question: are your AI providers operating infrastructure that meets these standards?

Practically: cloud providers with Australian-region infrastructure (AWS, Microsoft, Google) are subject to local operating requirements and can provide documentation. Offshore providers with no Australian presence cannot demonstrate compliance with local standards and increasingly cannot be the default choice for sensitive Australian workloads.

Why This Matters More Than Most Businesses Realise

Australian government policy documents are typically read by technology companies and government agencies. Mid-market businesses in financial services, healthcare, professional services, and mining tend to read them when they become regulations — by which time the remediation window has narrowed.

The 23 March expectations document is a pre-regulation signal. It is the government saying: here is the standard we expect, here is the direction of travel, here is what we will be holding infrastructure providers and their customers to as regulations tighten.

The OAIC Is Already Checking

The Office of the Australian Information Commissioner launched its first proactive compliance sweep in January 2026 — 60 organisations across six sectors. The sweep is explicitly checking how organisations handle personal data in AI systems. The government's infrastructure expectations are the policy backdrop to that enforcement activity.

When the OAIC checks how your AI systems handle Australian personal data, the question of where that data is processed and under what legal jurisdiction will be relevant. "We used the default API endpoint" is not an adequate answer when the question is cross-border data transfer compliance.

The December 2026 Deadline Is Eight Months Away

The Privacy Act's mandatory automated decision-making transparency obligations take effect on 10 December 2026. Every APP entity using AI to make decisions affecting individuals must have audit trails, explanation capability, and privacy policy disclosures in place.

The government's infrastructure expectations set the baseline for how that compliance infrastructure should be built and operated. Organisations building AI compliance infrastructure now — audit trails on Australian-jurisdiction infrastructure, explainability systems running locally rather than calling offshore APIs — are building to the standard the government is signalling.

The Supply Chain Question

The expectations apply directly to "data centres and AI infrastructure developers." But the document creates a due diligence expectation for the businesses that use them: are your AI infrastructure choices compliant with Australian government expectations?

If you are providing services to government agencies, defence contractors, healthcare providers, or regulated financial services businesses, your AI infrastructure will increasingly be subject to supply chain risk assessments that ask exactly this question. Building Australian-jurisdiction AI infrastructure now is both compliance preparation and competitive differentiation.

The Three Infrastructure Decisions Every Australian Business Should Make Now

Decision 1: Audit Your AI API Endpoint Configuration

Map every AI tool your business uses. For each, identify where the data is processed:

  • Azure OpenAI: is your deployment in Australia East? Or using a default US endpoint?
  • AWS Bedrock: is your model deployment in ap-southeast-2 (Sydney)? Or us-east-1?
  • Google Vertex AI: is your endpoint in australia-southeast1? Or a default US region?
  • OpenAI, Anthropic, Cohere direct APIs: where is your data processed? What data residency commitments has the provider made?

For most businesses that have not explicitly configured data residency, the honest answer is "US infrastructure." That is the gap to close.
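Once you have the inventory, the mapping exercise above can be scripted. A sketch, assuming the inventory is a simple mapping of tool name to configured region (the tool names and regions shown are hypothetical):

```python
# Regions treated as Australian jurisdiction for this audit.
AU_ALLOWED = {"ap-southeast-2", "australiaeast", "australia-southeast1"}

def audit_endpoints(deployments: dict) -> list:
    """Return the tools whose configured region falls outside
    Australian jurisdiction -- these are the gaps to close."""
    return sorted(tool for tool, region in deployments.items()
                  if region not in AU_ALLOWED)

# Hypothetical inventory of AI tools and their configured regions.
gaps = audit_endpoints({
    "chat-assistant": "us-east-1",       # default region, never configured
    "doc-summariser": "ap-southeast-2",  # explicitly pinned to Sydney
})
# gaps == ["chat-assistant"]
```

For direct APIs (OpenAI, Anthropic, Cohere) there is no region string to check; the equivalent audit question is what data residency commitment, if any, appears in the provider's agreement.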

Decision 2: Classify Your AI Workloads by Data Sensitivity

Not all AI workloads require the same data residency controls. A classification framework:

Australian jurisdiction required:

  • Any workload processing personal information about Australian individuals (Privacy Act APP entities)
  • APRA-regulated entity workloads involving customer financial data
  • Healthcare workloads involving patient health information
  • Government-adjacent workloads involving Commonwealth data

Australian jurisdiction strongly preferred:

  • Professional services workloads involving confidential client data
  • Business data that could be subject to Australian legal proceedings or regulatory requests
  • Employee data and HR-related AI systems

Offshore processing acceptable with appropriate safeguards:

  • AI workloads on fully anonymised or aggregated non-personal data
  • Internal productivity tools processing no personal or confidential data
  • Prototyping and non-production systems with synthetic data only

For the first two categories, the government's 23 March expectations and the Privacy Act compliance framework both point to Australian-jurisdiction processing as the standard.
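One way to make the framework operational is to encode it as a per-workload decision rule. The three boolean flags below are illustrative simplifications of the criteria above, not a complete test:

```python
from enum import Enum

class Jurisdiction(Enum):
    AU_REQUIRED = "Australian jurisdiction required"
    AU_PREFERRED = "Australian jurisdiction strongly preferred"
    OFFSHORE_OK = "Offshore acceptable with safeguards"

def classify(personal_info: bool, regulated: bool, confidential: bool) -> Jurisdiction:
    """Map a workload to the three-tier framework above.

    personal_info: processes personal information about Australians
    regulated:     APRA, healthcare, or government-adjacent workload
    confidential:  confidential client, legal, or employee data
    """
    if personal_info or regulated:
        return Jurisdiction.AU_REQUIRED
    if confidential:
        return Jurisdiction.AU_PREFERRED
    return Jurisdiction.OFFSHORE_OK
```

Running every AI workload through a rule like this, and recording the result, gives you the classification evidence a supply chain assessment or OAIC inquiry would ask for.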

Decision 3: Review Your AI Vendor Agreements

The government's expectations create a due diligence standard for vendor selection. For every AI provider that processes Australian personal data on your behalf, you should be able to confirm:

  • Data residency: personal data is processed and stored within Australian jurisdiction
  • Security standards: the provider meets ASD Essential Eight or equivalent
  • Breach notification: the provider will notify you of security incidents within timeframes that allow you to meet your Privacy Act notifiable data breach obligations
  • Audit rights: you can request evidence of compliance with their stated data handling commitments

Most businesses currently using cloud AI APIs do not have these provisions in their vendor agreements. Updating them is a procurement and legal task — it does not require rebuilding infrastructure.

The Market Opportunity in This Signal

The Australian businesses that move first on government-aligned AI infrastructure standards will have a competitive advantage in two markets.

Government contracting. The Australian Government is the largest technology procurement market in the country. The 23 March expectations will translate into supply chain requirements for AI services used in government contracts. Businesses with documented Australian-jurisdiction AI infrastructure and Essential Eight-aligned security practices will be positioned to pass these assessments. Businesses running default offshore AI APIs will not.

Enterprise clients in regulated sectors. Banks, insurers, healthcare providers, and professional services firms are subject to their own regulatory requirements around technology supply chains. As those requirements extend to AI infrastructure (APRA's CPS 234, healthcare privacy regulations, legal professional conduct rules), their suppliers and technology partners will face the same expectations. Demonstrating compliant AI infrastructure is a sales differentiator in these markets.

What Akira Data Does Here

Every AI system Akira Data builds for Australian businesses runs on Australian-jurisdiction infrastructure by default — AWS Sydney, Azure Australia East, or Google Cloud Sydney depending on the use case. This is not an optional extra; it is the architecture standard.

The AI Readiness Sprint (AUD $7,500, 2 weeks) includes an infrastructure audit as a core deliverable — mapping current AI tool deployments against data residency requirements and identifying the configuration gaps to close. For businesses that have received or anticipate receiving a government or enterprise client supply chain assessment on AI infrastructure, this audit provides the documented evidence of compliance posture.

The Privacy-Safe AI Implementation (from AUD $20,000) delivers the complete compliance architecture: Australian-jurisdiction processing, audit trail infrastructure, explanation capability for the December 2026 Privacy Act deadline, and vendor agreement review for third-party AI providers.

The 23 March expectations document is not the last word on Australian AI infrastructure standards. It is the opening statement. The businesses that read it now and act on it will be compliant when the next statement arrives. The businesses that miss it will be in the retrofit queue.


*Akira Data builds Privacy Act-compliant, Australian-jurisdiction AI systems for mid-market businesses. All data processed within Australia by default. [Start with an AI Readiness Sprint →](/contact)*

*This article references the Australian Government's "Expectations of data centres and AI infrastructure developers" (published 23 March 2026), Australia's National AI Plan (late 2025), the Privacy and Other Legislation Amendment Act 2024, and the OAIC's January 2026 proactive compliance sweep. This article is general information only and does not constitute legal advice.*
