As a national, membership-based, not-for-profit organisation that promotes and protects human rights in a digital context, EFA is deeply concerned by persistent data privacy and security issues within the National Disability Insurance Scheme (NDIS).
Historical budget cuts and sweeping plan adjustments made under the NDIS PACE system in 2023 have worsened an already compromised situation, with serious implications for the privacy and security of sensitive participant data that should concern all Australians.
The NDIS has a long-standing record of inadequate data privacy protections. Two examples: the NDIA’s statement regarding an NDIS data breach that occurred in November 2023, and the 2022 security breach of CTARS, a cloud-based client management system used by NDIS service providers.
For years, reports from the Australian National Audit Office (ANAO) and various inquiries (including former head of the NDIS Technology Authority Marie Johnson’s testimony at the NDIS Bill Senate Inquiry Hearing) have highlighted major deficiencies in the Agency’s handling of participant information, much of which is highly sensitive and deeply personal.
The introduction of the PACE system, which stores sensitive participant data offshore in Salesforce’s German data centre, raises further concerns about the increased risk of data breaches that comes with offshoring personal data. Recent FOI requests reveal that the National Disability Insurance Agency (NDIA) has struggled to provide proof of data security assurances, leaving critical questions unanswered. This apparent lack of due diligence risks both participant privacy and public trust in the Agency’s ability to protect highly sensitive data.
It also appears that participants are being pressured by the NDIA to use the myNDIS app, which collects extensive personal and sensitive data, including navigation and identity verification information. Data sharing by the app with Services Australia is framed as essential, but this broad access exposes sensitive information to a higher risk of compromise or misuse. The app’s assurance that it will take “reasonable steps” to secure data is deliberately vague and lacks concrete commitments to encryption or regular security reviews, leaving privacy protections weak. In all, it looks like cyber-washing.
Although the app allows users to access and correct their data, it lacks an option for deletion, a standard privacy control in many frameworks. Unnecessarily long or indefinite data retention, especially in a tool intended for fraud detection, undermines privacy rights, increases the risk and consequences of a data breach, and raises concerns about excessive government monitoring without clear justification. A shift to “privacy by design” would address these issues by minimising data collection and incorporating stronger security, deletion and data anonymisation practices.
The absence of effective risk management within the NDIA is equally concerning. The ANAO audit reveals failures in handling complaints that pose immediate risks to participant safety, adding to a pattern of weak data protections and eroding confidence in a system meant to provide crucial support.
EFA is also concerned by the NDIA’s increasing dependence on automated decision-making tools incorporating machine learning and AI platforms, such as the Budget Calculation Instrument, echoing issues associated with RoboNDIS and Robodebt. (For more insight, see Marie Johnson’s Substack article: A Look Inside the Deadly Digital Transformation Agenda that created RoboDebt and RoboNDIS.)
Putting to one side the integrity of the design and training of the relevant machine learning or AI system, over-reliance on algorithmic assessments without adequate human oversight risks arbitrary decisions based on statistical averages rather than individual needs, contravening Australia’s obligations under the UN Convention on the Rights of Persons with Disabilities (UNCRPD).
Despite the touted benefits of PACE, ongoing issues—including data migration failures, frequent service outages, and security gaps—indicate that it falls short of the protections and service levels that participants need. Serious concerns remain about PACE’s integrity and security, increasing the risk of data breaches and further eroding trust in the NDIS.
EFA emphasises that over 600,000 Australians rely on the NDIS for essential support. It is imperative for the NDIA to prioritise data privacy, risk management, and system integrity. We note the government has budgeted circa $170m for an NDIA fraud prevention capability uplift, which is clearly reasonable, but we have not found any budget item dealing with the privacy and security remediation of PACE, or with the ML and AI platforms in the NDIS ecosystem that fail participants through inherent flaws in substantive fairness, accountability and transparency, and through the removal of human participation in decision making.
With the election almost upon us, we demand to know what the government will do to ensure the safety of all people and providers using the defective PACE system, and the yet-to-be-constructed Entitlement Calculation Engine, built on experimental, unproven algorithms with no equivalent anywhere in the world.
About EFA
EFA is a national, membership-based, not-for-profit organisation that promotes and protects human rights in a digital context.