Australia Rejects TDM Exception for AI Training: A Global Template for Creator Rights?
by ChatGPT-4o
In a landmark decision applauded across Australia’s creative industries, the Albanese government has ruled out introducing a Text and Data Mining (TDM) exception into its copyright law. This move effectively prevents artificial intelligence (AI) developers from freely harvesting copyrighted material—including books, news, music, TV, and other creative content—to train large language models (LLMs) without the rights holder’s consent. The decision signals a strong commitment to upholding copyright in the face of accelerating technological disruption, and it stands in sharp contrast to more permissive regimes in jurisdictions such as the United States and Japan.
This essay synthesises the responses from leading Australian creative organisations, unpacks the implications for AI development and rights management, and evaluates whether this decision sets a precedent other countries should follow.
The Decision: A Line in the Sand
The Australian government, after considerable lobbying and expert consultation, announced it would not adopt a blanket TDM exception for AI training. This decision follows recommendations from key bodies and aligns with broader copyright principles that emphasise consent, control, and compensation.
Lucy Hayward, CEO of the Australian Society of Authors (ASA), framed the ruling as a “watershed moment,” describing the prior unauthorised scraping of creative works for AI training as “the greatest act of copyright theft in history.” She called for stronger accountability mechanisms, including a mandatory code of conduct requiring AI companies to:
Disclose the sources used to train their models
Seek direct or collective licensing agreements
Provide retroactive compensation for previously ingested material
Respect Indigenous Cultural and Intellectual Property (ICIP) protocols
Unified Support Across the Creative Sector
The response from Australia’s creative industries was unanimous and celebratory:
ARIA and PPCA (music sector) praised the government for rejecting the false dichotomy between innovation and copyright, highlighting how licensing frameworks have already supported ethical AI collaborations. They noted that “copyright and IP laws are the foundation of the creative economy, the digital economy, and the technology industry”—not obstacles to them.
Free TV Australia reinforced the importance of licensing in protecting the news and broadcasting sectors from exploitation by tech giants. CEO Bridget Fair emphasised that weakening copyright laws under the guise of AI innovation would have amounted to “legalised content theft,” and called for transparency rules and ACCC-led oversight of the AI market.
This widespread support illustrates the cross-sector consensus in Australia: creators want innovation, but not at the expense of the legal and economic structures that sustain their work.
Implications for AI Developers and Global IP Frameworks
This decision forces AI developers to reconsider their data acquisition strategies. Without a TDM exception, developers must engage with rights holders through negotiated agreements, which may increase licensing costs but also incentivise transparent and ethical data sourcing.
There are potential trade-offs: smaller AI startups may face barriers due to licensing complexity or cost. However, the alternative—free-for-all scraping—risks long-term damage to the creative ecosystem, cultural heritage, and public trust in AI outputs. Notably, Australia’s approach is stricter than even the EU’s opt-out-based TDM regime under the DSM Directive, which permits mining unless rights holders reserve their rights, and it diverges sharply from the US fair use doctrine, which many tech companies rely on to justify mass ingestion of copyrighted material.
By recommitting to its Copyright and AI Reference Group (CAIRG), Australia is also investing in long-term policymaking that brings stakeholders—authors, artists, developers, First Nations leaders—into the fold.
Should Other Countries Follow Suit?
Yes—but with caveats.
Australia’s decision provides a compelling model for countries seeking to balance creative rights with AI innovation. Its strengths lie in:
Clear ethical framing: The policy affirms that creators are not raw material for technology but stakeholders in the AI economy.
Legal clarity: It prevents ambiguous interpretations that tech companies could exploit to justify unchecked scraping.
Cross-sector unity: The policy enjoys rare support from music, literature, news, and broadcasting industries, showcasing robust democratic legitimacy.
However, nations considering a similar stance must also prepare:
Robust licensing infrastructure: Collective management organisations and digital licensing platforms must be streamlined and scalable.
International harmonisation: Disparate copyright regimes could lead to data arbitrage, where AI firms train models in jurisdictions with weaker protections.
Legal enforcement: Countries must be ready to investigate, audit, and enforce compliance, especially against offshore AI developers.
Conclusion: A Blueprint for Ethical AI?
Australia’s decision to rule out a TDM exception for AI training reflects a deliberate and principled stance: innovation must not come at the cost of creative livelihoods or cultural sovereignty. In doing so, it aligns itself with ethical AI development, strengthens trust in digital platforms, and sends a clear message to both domestic and global tech firms—creators deserve a seat at the table.
Other countries should view this as more than just a copyright reform; it is a signal of values. If adopted with careful implementation, this model could serve as a global blueprint for balancing the promise of AI with the rights of those whose work underpins its capabilities. The age of unlicensed scraping must give way to an era of licensed, ethical, and accountable AI development.
