Insight · March 2026 · 14 min read

Indigenous Australians, Digital Inclusion, and AI: Where We Stand in 2026

Aboriginal and Australian flags flying above the Sydney Opera House. Photo via Pexels.

One in five Australians is digitally excluded. For First Nations people, it is two in five. That is not a historical statistic. It is the state of things right now, in 2026, according to the Australian Digital Inclusion Index 2025, published by RMIT, Swinburne University, and Telstra.

As AI becomes embedded in how businesses operate, how government services are delivered, and how people access information, the digital divide risks becoming an AI divide. Communities that cannot access the internet reliably will not benefit from AI. Worse, AI systems trained without Indigenous perspectives risk making decisions about First Nations people without understanding or consent.

This article looks at the data on where things stand, the specific risks AI poses for Indigenous communities, the projects that are getting it right, and what businesses deploying AI in Australia should be aware of.

The digital inclusion gap in numbers

The Australian Digital Inclusion Index measures three dimensions: access (affordability and connectivity), ability (digital skills), and attitudes (confidence and motivation). It tracks these across demographics, geography, and socioeconomic status.

The 2025 data paints a clear picture:

Measure | First Nations | All Australians
Digitally excluded | 40.9% | 20.6%
Digital inclusion gap (national) | 10.5 pts below average | Baseline
Gap in remote areas | 16.5 points | N/A
Gap in very remote areas | 22.8 points | N/A
Daily internet use (remote, 2024) | 62% | 95%
Not online at all (remote, 2024) | 14% | 2%

Sources: Australian Digital Inclusion Index 2025; RMIT, December 2025

There is some progress. Between 2022 and 2024, daily internet use in remote First Nations communities rose from 44% to 62%. Digital ability scores in very remote areas improved by 8.7 points (from 45.8 to 54.5). The Australian Government announced free public Wi-Fi in 53 remote communities.

But the Closing the Gap Target 17, which aimed for equal digital inclusion by 2026, is widely considered unlikely to be met. A 10.5-point national gap does not close in a year. And as the Australian Academy of Humanities pointed out, there is little systematic data collection to even track progress properly.

The human cost of staying connected

Digital inclusion is not just about whether someone has a phone or laptop. It is about whether they can afford to stay connected without sacrificing other essentials.

According to research published by ACS Information Age, 53% of First Nations people have gone without food or skipped paying bills in order to maintain their internet or phone connection. That is not a connectivity problem. That is a poverty problem intersecting with a digital dependency problem.

When government services increasingly move online, when job applications require internet access, when telehealth replaces in-person consultations, staying connected is not optional. It is a prerequisite for participation in modern life. And for more than half of First Nations Australians, that participation comes at the cost of basic needs.

How AI risks making things worse

The Lowitja Institute's December 2025 report on the future impact of AI on First Nations communities identified three categories of risk:

  • Digital exclusion compounded by AI. If you cannot reliably access the internet, you cannot access AI-powered services. As government services, healthcare, education, and employment increasingly integrate AI, communities without digital access fall further behind.
  • Algorithmic bias. AI systems trained on data that underrepresents or misrepresents First Nations people will produce biased outputs. This affects everything from credit decisions to hiring algorithms to healthcare triage systems. If the training data does not include your community, the system does not work for your community.
  • Data colonialism. AI models are trained on vast datasets scraped from the internet. This includes Indigenous cultural knowledge, languages, stories, and imagery, often without the knowledge or consent of the communities those materials belong to. The Lowitja Institute describes this as a continuation of colonial extraction in digital form.

These risks are not hypothetical. They are happening now.

AI-generated imagery and cultural flattening

A February 2026 study from Murdoch University, published in The Conversation, examined what happens when you ask AI image generators to create pictures of "Indigenous Australians."

The results were damaging. Adobe's platform hosted AI-generated images labelled as "Indigenous Australian" that bore no resemblance to real people. Midjourney produced images of "Indigenous Australians" that more closely resembled stereotypical depictions of African tribespeople. The AI had no understanding of the fact that there are more than 250 distinct Indigenous language groups in Australia, each with their own cultural identity, practices, and connection to specific lands.

The researchers used the term "cultural flattening" to describe what happens when AI reduces the extraordinary diversity of Indigenous cultures into a single, generic, and often inaccurate visual. They also called it a new form of "technological colonialism," where the tools of the dominant culture are used to overwrite and misrepresent the cultures they were never designed to understand.

For businesses using AI to generate marketing imagery, this is worth understanding. If your AI tool cannot distinguish between 250+ distinct cultural groups, it should not be generating images that claim to represent any of them.

What is working: AI on community terms

The Lowitja Institute report found that the most promising AI projects involving First Nations communities are those "taken up on community terms." The common thread is governance: the community decides what data is collected, how it is used, and who benefits from it.

Mamutjitji Story App

Developed through Deakin University's Abundant Intelligence project, this app uses AI for language and cultural preservation. The critical difference from mainstream AI applications: community governance determines what is recorded, how it is stored, and who can access it. The AI serves the community's goals, not the other way around.

Kimberley Closed-System AI

Research from the University of Western Australia, published in The Conversation in March 2026, describes a project in the Kimberley region where AI is used within a closed system governed entirely by the community. The data never leaves community control. The researchers note that general-purpose AI like ChatGPT has "no understanding of cultural protocols or data sovereignty," but that Indigenous cultures, with their deep oral knowledge traditions, may actually be "natural users" of AI when the system is designed on their terms.

Both projects share a principle: the community is not the subject of the AI. The community is the owner. This is the core of Indigenous data sovereignty, and it is the standard against which any AI project involving First Nations communities should be measured.

What the research recommends

Across the sources reviewed for this article, several recommendations appear consistently:

  • Mandate Indigenous data sovereignty in AI regulation. Any AI system that processes data about or from First Nations communities should require free, prior, and informed consent. This should be built into Australia's AI governance framework, not bolted on as an afterthought.
  • Fund First Nations-led AI hubs. The Lowitja Institute specifically recommends funding Indigenous-led AI research and development centres. Communities need the capacity to build and govern their own AI tools, not just be consulted about tools built by others.
  • Close the access gap before layering AI on top. AI cannot help communities that cannot get online. Infrastructure investment in remote connectivity, affordable devices, and digital literacy remains the foundation.
  • Audit AI systems for bias against Indigenous Australians. Any AI system used in government services, hiring, lending, or healthcare should be tested for disparate impact on First Nations people before deployment.
  • Stop AI-generated misrepresentation. AI platforms should prevent the generation of imagery that claims to represent Indigenous cultures without cultural authority or accuracy.

What this means for businesses deploying AI in Australia

If you are a business implementing AI, particularly if your AI interacts with customers, makes decisions about people, or generates content, these issues are relevant to you.

  • AI-generated imagery. If you use AI to generate marketing images, check what it produces when asked to represent diverse Australian communities. If the output is stereotypical or inaccurate, do not use it.
  • Algorithmic bias in hiring. If you use AI in recruitment, test whether it produces different outcomes for candidates from different backgrounds. Under Australian anti-discrimination law, you are liable for the outputs of your AI systems regardless of intent. Our guide on AI hiring bias covers this in detail.
  • Customer-facing AI. If your AI chatbot or voice agent serves a diverse customer base, test it with diverse inputs. Does it understand different accents, dialects, and communication styles? Does it treat all customers equitably?
  • Data practices. Know where your AI training data comes from. If it includes scraped content from the internet, it likely includes Indigenous cultural material used without consent. This is an ethical issue today and may become a legal one under evolving AI governance frameworks. See our AI governance guide for more.
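For the hiring and decision-making points above, one widely used screening check is the "four-fifths rule": compare each group's selection rate to the highest group's rate and flag ratios below 0.8. The sketch below is illustrative only, assuming you can export your screening tool's decisions with a self-identified group field; the group names and figures are hypothetical, and the 0.8 threshold is a US EEOC heuristic, not a threshold under Australian anti-discrimination law.

```python
from collections import defaultdict

def selection_rates(outcomes):
    """Selection rate (shortlisted / applicants) per group.

    `outcomes` is a list of (group, shortlisted) pairs, e.g. exported
    from your AI screening tool's decision log.
    """
    totals = defaultdict(int)
    hires = defaultdict(int)
    for group, shortlisted in outcomes:
        totals[group] += 1
        if shortlisted:
            hires[group] += 1
    return {g: hires[g] / totals[g] for g in totals}

def adverse_impact_ratios(rates):
    """Each group's selection rate relative to the best-treated group.

    A ratio below 0.8 is commonly treated as a signal of possible
    disparate impact and a prompt for closer investigation.
    """
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Hypothetical decision log: (self-identified group, shortlisted?)
log = [("A", True)] * 40 + [("A", False)] * 60 \
    + [("B", True)] * 24 + [("B", False)] * 76

rates = selection_rates(log)           # A: 0.40, B: 0.24
ratios = adverse_impact_ratios(rates)  # A: 1.0, B: 0.6
flagged = [g for g, r in ratios.items() if r < 0.8]
print(flagged)  # ['B'] — warrants investigation before deployment
```

A passing check here does not prove a system is fair; it is a coarse first filter that should sit alongside qualitative review and testing with realistic, diverse inputs.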

Responsible AI is not just about compliance. It is about building systems that work for everyone, not just the majority. And in Australia, "everyone" includes the oldest continuing cultures on Earth.

Frequently asked questions

What is the Closing the Gap Target 17?

Closing the Gap Target 17 aims to achieve equal levels of digital inclusion for Aboriginal and Torres Strait Islander people by 2026. It was set under the National Agreement on Closing the Gap. As of the 2025 Australian Digital Inclusion Index, there is still a 10.5-point gap nationally, rising to 22.8 points in very remote areas. The target is widely considered unlikely to be met by the end of 2026.

How many First Nations Australians are digitally excluded?

According to the Australian Digital Inclusion Index 2025, 40.9% of First Nations people are digitally excluded. This compares to 20.6% of all Australians. In very remote communities, the gap is even wider, with a 22.8-point difference in digital inclusion scores.

What is Indigenous data sovereignty?

Indigenous data sovereignty is the principle that First Nations peoples have the right to govern the collection, ownership, and application of data about their communities, lands, cultures, and resources. In the context of AI, it means ensuring AI systems do not use Indigenous cultural knowledge or data without free, prior, and informed consent.

How is AI being used positively in First Nations communities?

The most promising AI projects are those governed by the communities themselves. Examples include the Mamutjitji Story app for language and cultural preservation under community governance, and Kimberley-based research using closed-system AI where community members control the data. These projects treat AI as a tool for cultural preservation rather than extraction.

What are the risks of AI for Indigenous Australians?

Key risks include deepening the digital divide, algorithmic bias in government services, data colonialism (using Indigenous data without consent), cultural flattening through AI imagery that misrepresents diverse cultures, and appropriation of cultural knowledge by AI training datasets.

FlowWorks
AI consulting and automation for Australian businesses
1300 484 044 · ops@flowworks.com.au · 470 St Kilda Rd, Melbourne VIC 3004