One in five Australians is digitally excluded. For First Nations people, it is closer to two in five. That is not a historical statistic. It is the state of things right now, in 2026, according to the Australian Digital Inclusion Index 2025, published by RMIT, Swinburne University, and Telstra.
As AI becomes embedded in how businesses operate, how government services are delivered, and how people access information, the digital divide risks becoming an AI divide. Communities that cannot access the internet reliably will not benefit from AI. Worse, AI systems trained without Indigenous perspectives risk making decisions about First Nations people without understanding or consent.
This article looks at the data on where things stand, the specific risks AI poses for Indigenous communities, the projects that are getting it right, and what businesses deploying AI in Australia should be aware of.
The Australian Digital Inclusion Index measures three dimensions: access (affordability and connectivity), ability (digital skills), and attitudes (confidence and motivation). It tracks these across demographics, geography, and socioeconomic status.
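To make the three-dimension structure concrete, here is a minimal sketch of how a composite score and a gap between two groups might be computed. The equal weighting and the sample dimension scores below are assumptions for illustration only; they are not the ADII's published methodology or data.

```python
# Illustrative three-dimension inclusion index.
# NOTE: equal weighting and the sample scores below are assumptions
# for illustration; they are not the ADII's actual methodology or data.

def inclusion_score(access: float, ability: float, attitudes: float) -> float:
    """Combine the three dimensions into a single 0-100 score (equal weights assumed)."""
    return (access + ability + attitudes) / 3

# Hypothetical dimension scores for two groups (not real ADII figures).
national = inclusion_score(access=75.0, ability=70.0, attitudes=68.0)
remote_community = inclusion_score(access=55.0, ability=54.5, attitudes=50.0)

# The "gap" reported by indices like the ADII is the difference
# between two groups' composite scores, in points.
gap = national - remote_community
print(f"gap: {gap:.1f} points")
```

The point of the sketch is only that a single headline gap figure compresses three separate dimensions; two groups can have the same composite score with very different access, ability, and attitude profiles.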
The 2025 data paints a clear picture: 40.9% of First Nations people are digitally excluded, compared with 20.6% of all Australians, a national gap of 10.5 points that widens to 22.8 points in very remote areas.
Sources: Australian Digital Inclusion Index 2025; RMIT, December 2025
There is some progress. Between 2022 and 2024, daily internet use in remote First Nations communities rose from 44% to 62%. Digital ability scores in very remote areas improved by 8.7 points (from 45.8 to 54.5). The Australian Government announced free public Wi-Fi in 53 remote communities.
But Closing the Gap Target 17, which aimed for equal digital inclusion by 2026, is widely considered unlikely to be met. A 10.5-point national gap does not close in a year. And as the Australian Academy of Humanities pointed out, there is little systematic data collection to even track progress properly.
Digital inclusion is not just about whether someone has a phone or laptop. It is about whether they can afford to stay connected without sacrificing other essentials.
According to research published by ACS Information Age, 53% of First Nations people have gone without food or skipped paying bills in order to maintain their internet or phone connection. That is not a connectivity problem. That is a poverty problem intersecting with a digital dependency problem.
When government services increasingly move online, when job applications require internet access, when telehealth replaces in-person consultations, staying connected is not optional. It is a prerequisite for participation in modern life. And for more than half of First Nations Australians, that participation comes at the cost of basic needs.
The Lowitja Institute's December 2025 report on the future impact of AI on First Nations communities identified three categories of risk:
These risks are not hypothetical. They are happening now.
A February 2026 study from Murdoch University, published in The Conversation, examined what happens when you ask AI image generators to create pictures of "Indigenous Australians."
The results were damaging. Adobe's platform hosted AI-generated images labelled as "Indigenous Australian" that bore no resemblance to real people. Midjourney produced images of "Indigenous Australians" that more closely resembled stereotypical depictions of African tribespeople. The AI had no understanding of the fact that there are more than 250 distinct Indigenous language groups in Australia, each with their own cultural identity, practices, and connection to specific lands.
The researchers used the term "cultural flattening" to describe what happens when AI reduces the extraordinary diversity of Indigenous cultures into a single, generic, and often inaccurate visual. They also called it a new form of "technological colonialism," where the tools of the dominant culture are used to overwrite and misrepresent the cultures they were never designed to understand.
For businesses using AI to generate marketing imagery, this is worth understanding. If your AI tool cannot distinguish between 250+ distinct cultural groups, it should not be generating images that claim to represent any of them.
The Lowitja Institute report found that the most promising AI projects involving First Nations communities are those "taken up on community terms." The common thread is governance: the community decides what data is collected, how it is used, and who benefits from it.
Developed through Deakin University's Abundant Intelligence project, the Mamutjitji Story app uses AI for language and cultural preservation. The critical difference from mainstream AI applications: community governance determines what is recorded, how it is stored, and who can access it. The AI serves the community's goals, not the other way around.
Research from the University of Western Australia, published in The Conversation in March 2026, describes a project in the Kimberley region where AI is used within a closed system governed entirely by the community. The data never leaves community control. The researchers note that general-purpose AI like ChatGPT has "no understanding of cultural protocols or data sovereignty," but that Indigenous cultures, with their deep oral knowledge traditions, may actually be "natural users" of AI when the system is designed on their terms.
Both projects share a principle: the community is not the subject of the AI. The community is the owner. This is the core of Indigenous data sovereignty, and it is the standard against which any AI project involving First Nations communities should be measured.
Across the sources reviewed for this article, several recommendations appear consistently:
If you are a business implementing AI, particularly if your AI interacts with customers, makes decisions about people, or generates content, these issues are relevant to you.
Responsible AI is not just about compliance. It is about building systems that work for everyone, not just the majority. And in Australia, "everyone" includes the oldest continuing cultures on Earth.
Closing the Gap Target 17 aims to achieve equal levels of digital inclusion for Aboriginal and Torres Strait Islander people by 2026. It was set under the National Agreement on Closing the Gap. As of the 2025 Australian Digital Inclusion Index, there is still a 10.5-point gap nationally, rising to 22.8 points in very remote areas. The target is widely considered unlikely to be met by the end of 2026.
According to the Australian Digital Inclusion Index 2025, 40.9% of First Nations people are digitally excluded. This compares to 20.6% of all Australians. In very remote communities, the gap is even wider, with a 22.8-point difference in digital inclusion scores.
Indigenous data sovereignty is the principle that First Nations peoples have the right to govern the collection, ownership, and application of data about their communities, lands, cultures, and resources. In the context of AI, it means ensuring AI systems do not use Indigenous cultural knowledge or data without free, prior, and informed consent.
The most promising AI projects are those governed by the communities themselves. Examples include the Mamutjitji Story app for language and cultural preservation under community governance, and Kimberley-based research using closed-system AI where community members control the data. These projects treat AI as a tool for cultural preservation rather than extraction.
Key risks include deepening the digital divide, algorithmic bias in government services, data colonialism (using Indigenous data without consent), cultural flattening through AI imagery that misrepresents diverse cultures, and appropriation of cultural knowledge by AI training datasets.