Most Australian business owners stopped paying attention to privacy law a long time ago. It felt like something only banks, telcos and hospitals had to worry about. That assumption no longer holds.
Privacy law has shifted. More changes are coming. Regulators are using sharper powers. Individuals can now sue directly for serious privacy breaches. And from 1 July 2026, an estimated 100,000+ small businesses are expected to be covered by the Privacy Act for the first time.
If your business collects customer details, stores employee records, runs a website, sends marketing emails, uses CCTV, or relies on any cloud-based system, the changes outlined below are likely to be relevant.
“Does this even apply to my business?”
The Privacy Act 1988 (Cth) (Privacy Act) has historically excluded most businesses with annual turnover under $3 million. This “small business exemption” has been one of the most generous carve-outs in the developed world — and it has made it easy for SME owners to treat privacy as someone else’s problem.
That exemption is now narrower than many business owners realise, and is set to narrow further.
A business is already covered today (regardless of turnover) if it:
- handles health information (allied health, beauty clinics, personal trainers, childcare, gyms, NDIS providers, telehealth)
- trades in personal information (sells, leases, or purchases mailing lists or customer databases)
- provides services under a Commonwealth contract
- operates a credit reporting body, residential tenancy database, or related business
- is related to a larger entity that is covered
- collects tax file numbers as part of a TFN recipient activity
From 1 July 2026, a further amendment is expected to widen the net — particularly affecting businesses that handle personal information at scale, regardless of turnover.
The second tranche of broader reforms now being progressed by the Federal Attorney-General is also widely expected to either abolish or significantly narrow the small business exemption altogether. The political momentum is unmistakable, even if the timing is not yet locked.
Importantly, even businesses that remain outside the Privacy Act can still be sued personally for serious invasions of privacy under the new statutory tort that commenced on 10 June 2025. That liability does not depend on Privacy Act coverage at all.
The practical takeaway: the days of assuming “we’re too small to worry about privacy” are ending.
Why the rules have tightened
Three years of high-profile breaches changed the political weather. Optus. Medibank. Latitude. HWL Ebsworth. Australian Clinical Labs. Millions of Australians had their personal data exposed — including, in some cases, identity documents, Medicare numbers, and sensitive medical information. Public trust in how organisations handle personal data collapsed.
The response has been a structural shift in how privacy is regulated and enforced. The model is moving from “publish a Privacy Policy on your website” to something operational — covering systems, contracts, staff training, vendor management, and incident response.
For businesses that have treated privacy as a website checkbox, the gap between what the law now expects and what they currently do is widening fast.
What “good” looks like in 2026
Eight themes capture where the law has moved. Each translates into something practical a business can act on.
1. Specificity at the point of collection
Vague, catch-all privacy policies copied from a template are increasingly indefensible.
When personal information is collected — through a website form, a sign-up flow, a booking, a quote request, a service enquiry — the person handing over their information should be told clearly:
- what is being collected
- why it is being collected
- who else will see it (including overseas providers like Mailchimp, HubSpot, AWS, Google, Stripe)
- how they can access, correct, or complain about that information
A short, plain-English collection notice at the point of capture is often more valuable than the long Privacy Policy buried in the website footer. Businesses may want to consider whether their current sign-up flows actually meet this standard, or whether they rely on the user clicking through without meaningful disclosure.
The general direction of reform is to tighten what “consent” actually means. Implied consent, pre-ticked boxes, and broad opt-outs are increasingly being treated as inadequate, particularly for sensitive information or marketing automation.
2. Data minimisation — collect less, keep less
A strong privacy position rests on a simple principle: data that is not held cannot be lost.
It may be worth auditing what your business actually collects and asking some honest questions:
- Is date of birth genuinely needed, or is an age tick-box enough?
- Is a full copy of someone’s driver’s licence required, or is sighting and recording the number sufficient?
- Are enquiry emails being kept for three years when six months would do?
- Are old customer records, ex-employee files, and ex-supplier details still sitting in active systems?
Retention discipline reduces breach risk, lowers cloud storage costs, simplifies access requests, and produces a much stronger defence if something goes wrong.
It is worth noting that other laws (tax record-keeping, employment record-keeping, financial services obligations) often require certain records to be retained for set periods. A retention plan that reconciles these competing pressures tends to be more useful than ad-hoc deletion.
3. Security as an operational discipline, not a policy statement
The “reasonable steps” obligation under Australian Privacy Principle 11 has been clarified to expressly include both technical and organisational measures. In plain English: regulators want to see actual controls in place, not a policy claiming they exist.
For most SMEs, a sensible practical baseline to consider includes:
- multi-factor authentication on email, accounting, CRM, cloud storage, and any system holding customer data
- elimination of shared logins (each staff member has their own credentials)
- staff access limited to what they need for their role
- encrypted laptops and mobile devices
- a documented offboarding process when staff leave (access revoked the same day)
- regular, tested backups
- clear internal rules on what cannot be sent in plain email (bank details, ID documents, health information)
- basic staff training on phishing, social engineering, and incident reporting
None of this requires enterprise-grade infrastructure. Most of it is now expected as a minimum.
For businesses handling particularly sensitive information — health data, children’s data, identity documents, payment card details — the bar is higher. Standards like ISO 27001, the Essential Eight, or PCI DSS may be worth considering as reference frameworks.
4. Vendor and third-party risk
Many privacy failures happen through suppliers — booking platforms, marketing tools, outsourced IT, accountants, debt collectors, AI tools, payment gateways.
If a vendor loses your customers’ data, your customers will hold your business responsible. So will the regulator. The “we used a reputable provider” defence is rarely a good one if the contract did not impose adequate security and breach notification obligations on that provider in the first place.
Before signing or renewing with any supplier that touches personal information, businesses may want to consider:
- where the data is stored (Australia or overseas, and which countries)
- what security controls the supplier has and whether they can evidence them (SOC 2, ISO 27001, penetration test reports)
- whether the supplier is permitted to use the data for its own purposes — including training AI models
- what happens to the data when the contract ends (deletion, return, retention)
- the supplier’s contractual breach notification timeframe (the modern expectation is hours, not days)
- audit rights and the right to inspect security practices
- liability allocation for breaches caused by the supplier
Standard supplier terms are usually written to protect the supplier, not the customer. Reviewing those terms — or negotiating addenda for critical providers — is increasingly a baseline expectation.
5. Individual rights and complaints
Individuals are becoming increasingly aware of their rights, and increasingly willing to use them. Businesses may want to plan for an increase in:
- Access requests (“send me everything you hold about me”) — these can be broad, time-consuming, and triggered by something as simple as a former employee or customer disputing a decision.
- Correction requests — particularly where personal information has been used in automated decision-making.
- Unsubscribe and deletion requests — including informal “please remove my account” requests outside the Spam Act framework.
- Complaints to the OAIC — which can lead to determinations, conciliation, or in some cases civil penalty proceedings.
A documented process — naming the person responsible, setting an internal response timeframe, and explaining how information is actually located across systems — tends to be far more useful than improvising under pressure.
The general direction of reform also points to a statutory right of erasure being introduced as part of the second tranche. This is Australia’s developing version of the European “right to be forgotten” — though its scope, exceptions, and interaction with other record-keeping laws are not yet finalised.
6. Data breach response
The Notifiable Data Breaches scheme requires notification to the OAIC and affected individuals where there is an eligible data breach: broadly, unauthorised access to or disclosure of personal information that is likely to result in serious harm. Where a breach is only suspected, the business must assess it as soon as practicable, and in any case within 30 days.
In practice, the most critical decisions in a breach often need to be made within the first 72 hours: containment, evidence preservation, regulator engagement, customer communications, and (sometimes) media response.
A documented Data Breach Response Plan does not need to be long. It needs to exist, name the responsible people, set out escalation triggers, and ideally have been tested at least once before the real thing happens. Businesses that improvise their first breach response usually pay for it twice — once in the response itself, and again in the regulator’s view of how it was handled.
7. Sensitive information, biometrics, and children’s data
Sensitive information under the Privacy Act includes health information, racial or ethnic origin, political opinions, religious beliefs, sexual orientation, biometric information used for automated identification, and criminal record. Collection, use, and disclosure of sensitive information attracts stricter consent requirements and tighter scrutiny.
If a business uses biometric tools — facial recognition for access control, fingerprint scanning for time and attendance, voice ID for customer service — the privacy implications are now significant. The OAIC's determination against Bunnings over its use of facial recognition shows that regulators are actively enforcing in this space.
The OAIC is also developing a Children’s Online Privacy Code, with consultation on the Exposure Draft open during 2026. The Code is expected to apply to social media services, relevant electronic services, and designated internet services likely to be accessed by under-18s — and to set out specific obligations around consent, data minimisation, and design of children-facing services.
Any business handling children’s data — childcare providers, education platforms, gaming services, tutoring services — may want to start thinking about how the Code is likely to affect them.
8. Automated decision-making and AI
From December 2026, businesses will be required to disclose in their privacy policies if they use personal information in automated decisions that significantly affect individuals — including decisions about credit, employment, eligibility, pricing, or access to services.
This is the early Australian foothold for AI governance under privacy law. The disclosure requirement applies whether the automated decision is made by a traditional algorithm, a machine learning model, or a large language model.
For businesses already using AI tools — chatbots, automated underwriting, automated triage, AI-assisted recruitment, generative AI for customer-facing communications — it may be worth starting to document now:
- what AI systems are in use
- what personal information they process
- what decisions they influence or make
- what human oversight applies
- where the AI provider stores and uses the data
That documentation tends to be far easier to build as systems are introduced than to retro-fit later.
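For businesses that want to keep this inventory in a form that can be checked as tools change, one option is a small structured register with a completeness check. The sketch below is illustrative only: the field names, the example entry, and the check are assumptions for demonstration, not anything prescribed by the Privacy Act.

```python
# A minimal sketch of an AI-system register kept as structured data.
# All field names and the example entry are illustrative only -- the
# Privacy Act does not prescribe a format for this documentation.

REQUIRED_FIELDS = {
    "system",
    "personal_information",
    "decisions_influenced",
    "human_oversight",
    "provider_data_handling",
}

AI_REGISTER = [
    {
        "system": "Customer-service chatbot",  # hypothetical example
        "personal_information": ["name", "email", "enquiry history"],
        "decisions_influenced": "triage and routing of support requests",
        "human_oversight": "staff approve any account change the bot suggests",
        "provider_data_handling": "inputs retained by provider; stored overseas",
    },
]

def incomplete_entries(register):
    """Names of systems whose entry is missing one of the required fields."""
    return [
        entry.get("system", "<unnamed>")
        for entry in register
        if not REQUIRED_FIELDS <= entry.keys()
    ]

# An empty list means every entry answers the five questions above.
print(incomplete_entries(AI_REGISTER))  # prints []
```

Keeping the register as data rather than a static document makes it trivial to flag gaps each time a new tool is adopted, which is exactly the point at which this information is easiest to capture.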
The real risks
Three risks are now meaningfully larger than they were two years ago.
Regulator risk. The OAIC can now issue infringement notices of up to $66,000 per contravention without going to Court. A mid-tier civil penalty has been introduced for interferences with privacy that are not “serious or repeated”, giving regulators a more accessible enforcement tool. New search, seizure, and investigation powers under the Regulatory Powers (Standard Provisions) Act are also in force. The OAIC has signalled it intends to use these powers — and the federal funding to do so has been increased.
Civil litigation risk. The new statutory tort for serious invasions of privacy (in force since 10 June 2025) provides individuals with a direct cause of action. Crucially, the plaintiff does not need to prove financial loss — damages can be awarded for hurt feelings, anxiety, and reputational harm alone. Class actions are a real possibility, and several plaintiff firms are already positioning. This exposure applies to any individual or organisation, regardless of whether they are covered by the Privacy Act.
Commercial risk. Customers ask about privacy now. Larger businesses ask about it in procurement processes. Cyber insurers ask about it in renewals — and price accordingly. A privacy incident damages relationships, contracts, and renewals long before any legal or regulatory exposure crystallises.
For directors of incorporated businesses, there is also a growing argument that failure to manage privacy risk adequately can engage personal duties under the Corporations Act — particularly where the failure causes material loss to the company.
A practical 30-day shortlist
For business owners who want a sensible starting point this quarter, the following list may be worth considering:
- Map your data. List what personal information your business collects, where it is stored, who can access it, and how long it is retained.
- Refresh your Privacy Policy and collection notices. Documents drafted before 2025 are likely to be out of date.
- Turn on multi-factor authentication. Email, accounting, CRM, cloud storage — anywhere personal information lives.
- Audit your top five vendors. Where is the data, what can they do with it, and what does the contract say?
- Draft a one-page Data Breach Response Plan. Names, contact details, decision points, and external advisers.
- Identify who owns privacy in your business. Even if it is the founder. Especially if it is the founder.
- Train your staff. A 30-minute session on phishing, password hygiene, and incident reporting can dramatically reduce risk.
- Review your AI use. Catalogue what tools are used, what data goes into them, and what decisions they influence.
None of these steps fixes everything on its own. Collectively, they put a business in materially better shape than most of its competitors.
The bottom line
Privacy law has moved from a back-office paperwork issue to a board-level commercial risk. The businesses that come through the next two years well are likely to be the ones that treated it as a business issue, not a legal one — building practical disciplines into how they operate, rather than waiting for a Bill, a breach, or a regulator to force the conversation.
For businesses unsure of where they currently stand, getting an early read on exposure tends to be considerably cheaper than dealing with a problem after the fact.
This article contains general information only and does not constitute legal advice. Envision Legal accepts no liability for any loss arising from reliance on this content. You should seek independent legal advice tailored to your specific circumstances. For enquiries, contact Envision Legal.
