PAW 2021 – That’s a wrap

Privacy Awareness Week was held 5-9 May 2021.

In our final post for Privacy Awareness Week (PAW), we share our five highlights and observations from the week.

1. A privacy win during PAW

The timing may have been coincidental, but we’ll take it. There was a notable win for privacy this week with the Senate Committee that reviewed the Government’s inter-agency data sharing law – the Data Availability and Transparency Bill – recommending the bill not be passed in its current form, noting a need for stronger privacy protections and security measures (among other things).

Our advocacy for greater attention to the privacy risks in the bill (as part of a collaborative submission with other privacy colleagues) was quoted in the Senate Committee’s report and in the news media this week.


2. Momentum building

We were energised to hear this week just how much focus and attention there is on privacy, particularly from a regulatory perspective. At a panel of regional privacy regulators hosted by the International Association of Privacy Professionals on Tuesday, we got insight into the breadth of activity currently underway.

At the Commonwealth level, clearly the focus is on the review of the Privacy Act. The States and Territories are also running various projects to bolster privacy protections, from the privacy officers project in Victoria, mandatory breach reporting in NSW and privacy champions network in Queensland, to the focus on managing privacy in complex cross-cultural contexts in the Northern Territory.

Overseas, New Zealand is looking at improvements within its public sector, the Philippines will be launching a privacy mark, and Singapore is implementing its new data protection law.

Many of the regulators on Tuesday also expressed the view that it is time for everyday Australians to make privacy a priority and realise that every time we hand over our data, we’re not only making an individual decision but also contributing to the future fabric of our society.

3. Privacy spat!

What better way to draw attention to trust and transparency during PAW than a stoush between two technology platforms over privacy?

Signal and Facebook went at it after Signal used Facebook’s own advertising platform to create ads that exposed the categories Facebook uses to classify users. The ads appeared as placards and contained customised messages such as: “You got this ad because you’re a certified public accountant in an open relationship. This ad used your location to see you’re in South Atlanta. You’re into natural skin care and you’ve supported Cardi B since day one.”

Facebook labelled the move a stunt, while Signal claimed Facebook disabled its account as a response. Either way, fantastic timing for PAW.

4. Privacy is precious

Speaking of ads, our attention this week was drawn to New Zealand’s TV commercial for privacy, created to raise awareness of its new Privacy Act, which came into operation in December 2020. The ads feature the theme “Privacy is precious” and are at once simple to understand while being wonderfully evocative. Check it out here.

The Kiwis have a strong track record of pumping out great videos to raise awareness – see the Air New Zealand air safety videos and New Zealand Government online safety ads. Perhaps it’s time to add “privacy advertisements” to the list of cross-Tasman rivalries, which already includes cricket, rugby and netball. Can Australian creatives take up the charge and create an even better pitch to help the Australian community prioritise privacy?

5. Hurray for privacy drinks

Finally, it was great to celebrate Privacy Awareness Week with an old-fashioned drink with friends and colleagues. elevenM hosted drinks at O Bar in Sydney on Wednesday night, and we were thrilled to be back together in person with so many of our valued friends, clients, partners, colleagues, and other fellow travellers in attendance.

It reminded us what a diverse and vibrant community we have and filled us with inspiration and optimism about the future, as we work together to solve some of the most complex issues of our time. Thanks to all who came, and we hope those who couldn’t will make it next time.

Privacy in focus: Towards a unified privacy regime

In this final post from our ‘Privacy in focus’ blog series we discuss changes we believe are necessary to enhance trust and confidence in the broader privacy landscape in Australia.

Thus far in the ‘Privacy in focus’ series, we have looked at the operation of the Privacy Act and its most fundamental elements: the definition of personal information, the operation of notice and consent, and organisational accountability. We have proposed changes that we believe will strengthen how privacy is managed.

In approaching our final post, we put ourselves in the shoes of the individual. What does the operation of the privacy regime look like to them? Are there gaps? Is there consistency in who is covered and who isn’t? Do they feel empowered to participate in the defence of their rights? In asking these questions, it became clear that there may be some blind-spots and loopholes undermining confidence in the broader system.

Time to close the gaps

A number of exemptions in the Privacy Act – while reasonable at the time of their inclusion – no longer appear valid given the way our economy and online environment functions. Most striking are the exemptions for small businesses and employee records.

Due to technology, small businesses today are capable of great scale, with many handling significant volumes of personal information. Small businesses account for 97% of all Australian businesses (by employee size) and at least a third of the value of the Australian economy. Excluding such a large chunk of the economy from privacy regulation not only places many individuals at risk of potential harm, it erodes trust in our privacy regime, is out of step with international standards and is one reason Australia has not been granted an ‘adequacy’ finding under the GDPR.

That said, many small businesses already comply with overseas privacy regimes (which do not include a small business exemption) because they also service international customers. Removing the small business exemption may provide many such businesses with consistency and reduce friction in international transactions.

The changing times have also left employee records and political exemptions behind. Due to the convergence of digital experiences inside and outside the workplace, it’s likely many individuals have the same expectations of privacy from their employer as they do of businesses they deal with outside the workplace. The emergence of workplace surveillance technologies also further underscores the need for better protections for employee information.

On the political front, developments such as Cambridge Analytica, growing voter databases and unsolicited campaign messaging have clearly cast a shadow over the validity of exemptions for political parties.

As a final note, the Australian Community Attitudes to Privacy Survey 2020 found almost three-quarters of Australians feel exempt organisations “should be required to protect personal information in the same ways that government and larger businesses are required to”.

Two final strings to the bow

Earlier in the series, we made the case that consumers are not truly empowered to manage their own privacy – largely because the idea that we can make rational decisions about future informational harms, especially in increasingly complex digital ecosystems, is a deeply flawed premise.

This lack of empowerment is even more explicit when things go wrong. Firstly, many privacy wrongs are not covered by the Privacy Act at all. Absent a general ‘privacy tort’ or independent statutory cause of action, individuals lack the ability to take to court serious invasions of privacy that don’t involve the Privacy Act, such as unauthorised surveillance or infringements on the privacy of communications.[1]

Even when an infringement is covered by the Act (i.e. matters of data protection by organisations), individuals presently have no direct recourse. They must rely on privacy regulators – such as the Office of the Australian Information Commissioner – to act against the violating entity on their behalf. This makes a resource-strained regulator something of a gatekeeper.

It seems logical that a key part of empowering individuals to protect their privacy would be to give them the right to seek judicial redress when harmed. Tellingly, 78% of Australians believe that they should have the right to seek compensation in the courts for a breach of privacy. Indeed, the need for a statutory cause of action for invasions of privacy has been canvassed previously, including by the Australian Law Reform Commission (ALRC) and the Australian Competition and Consumer Commission’s Digital Platforms Inquiry.

There does need to be careful consideration of how a direct right of action is designed. Much has been written about the deep pockets required to approach the courts, the counter-argument that the courts could be overwhelmed by trivial privacy complaints, and the business community’s fear of being dragged into ‘nuisance lawsuits’. Options such as a threshold test and procedural considerations have been contemplated in detail, including by the ALRC in its 2013 issues paper and by elevenM in our direct right of action research paper.

Introducing a direct right of action might also shift sentiment, drawing individuals to think more keenly about their own privacy. A direct right of action also amplifies the idea that privacy is of value in our society, particularly once individuals pursuing actions becomes visible. As argued by US lawyer Yosef Getachew in making the case for a right of action in the US, a direct right of action is “an extension of democratic participation, like petitioning government, writing members of Congress, and talking to state legislators.”

Towards a unified system

As we’ve worked our way through this blog series on the review of the Privacy Act, what has emerged clearly is that the legislation is no longer in step with the modern digital environment, and that reform is necessary.

This includes changes at the level of definitions of terms like personal information. Equally important is understanding whether measures like consent and notice remain effective and durable as digital interactions become more complex. When we consider the roles of individuals and organisations, we also must ask: how do we strike the right balance of accountabilities? Should we reframe where the onus of responsibility sits?

In short, we seek to imagine a more unified system in which all parts – individuals, organisations, regulators and the regulations – work together to deliver a digital environment marked by consistently good privacy and high levels of trust. We’re confident many of these issues will be contemplated seriously by a revised Privacy Act, and hope this series has been a constructive and informative contribution.


Read all posts from the Privacy in focus series:
Privacy in focus: A new beginning
Privacy in focus: Who’s in the room?
Privacy in focus: What’s in a word?
Privacy in focus: The consent catch-22
Privacy in focus: A pub test for privacy
Privacy in focus: Towards a unified privacy regime

Privacy in focus: A pub test for privacy

In this instalment of our ‘Privacy in focus’ blog series, we look beyond consent and explore other ideas that could make privacy easier and more manageable.

In our last post, without mentioning milkshakes, we talked about how central consent has become in the regulation of privacy and how putting so much weight on individuals’ choices can be problematic.

This time, we’re into solution mode. How can we make privacy choices easier? How might we start moving away from consent as the touchstone? What might privacy law look like if it didn’t rely so heavily on individuals to monitor and control how their information is used?

Start where you are

It is likely that notice and consent will always be a critical piece of the privacy puzzle, so before we start talking about evolving our entire regulatory model, we should probably cover what might be done to improve our current approach.

Last time, we identified four related ways in which individual privacy choices get compromised:

  • we don’t have enough time
  • we don’t have enough expertise
  • we behave irrationally
  • we are manipulated by system designers.

So what can we do to address these shortcomings?

We can raise the bar — rule out reliance on consent in circumstances where individuals are rushed, do not understand or have not truly considered their options, or in truth do not have any options at all.[1] Raising the bar would rule out a range of practices that are currently commonplace, such as seeking consent for broad and undefined future uses, or where there is a substantial imbalance of power between the parties (such as in an employment relationship). It would also rule out what is currently a very common practice of digital platforms — requiring consent to unrelated secondary uses of personal information (such as profiling, advertising and sale) as a condition of service or access to the platform.

We can demand better design: clearer, shorter and more intelligible privacy communications, perhaps even using standardised language and icons. Apple’s recently adopted privacy ‘nutrition labels’ for iPhone apps are a great example of what this can look like in practice, but we needn’t stop there — there is a whole field of study in legal information design, and established practices and regulatory requirements from other industries (such as financial services product disclosures) which could be drawn on.

We can ban specific bad practices: manipulative and exploitative behaviours that should be prohibited. Australian Consumer Law goes some way to doing this already, for example by prohibiting misleading and deceptive conduct (as the ACCC’s recent victory against Google will attest). But we could go further, for example by following California in specifically prohibiting ‘dark patterns’ — language, visual design, unnecessary steps or other features intended to push users into agreeing to something they wouldn’t otherwise agree to. Another, related option is to require privacy-protective default settings to prevent firms from leveraging the default effect to push users towards disclosing more than they would like.

Who should take responsibility for safety?

But even if we did all of the above (and we should), taking responsibility for our own privacy in a world that is built to track our every move is still an impossibly big ask. Instead of expecting individuals to make the right choices to protect themselves from harmful data practices, the Privacy Act should do more to keep people safe and ensure organisations do the right thing.

What would that look like in practice? Focusing on organisational accountability and harm prevention would mean treating privacy a bit more like product safety, or the safety of our built environment. In these contexts, regulatory design is less about how to enable consumer choice and more about identifying who is best equipped to take responsibility for the safety of a thing and how best to motivate that party to do so.

Without strict safety requirements on their products, manufacturers and builders may be incentivised to cut corners or take unnecessary risks. But for the reasons we’ve already discussed, it doesn’t make sense to look to consumers to establish or enforce these kinds of requirements.

Take the safety of a children’s toy, for example. What is more likely to yield the optimal outcome – having experts establish standards for safety and quality (e.g. for non-toxic paints and plastics, part size, etc.) which manufacturers must meet to access the Australian market, and against which products will be tested by a well-resourced regulator? Or leaving it to the market and having every individual, time-poor consumer assess safety for themselves at the time of purchase, based on their limited knowledge of the product’s inner workings?

Whenever we can identify practices that are dangerous or harmful, it is far more effective and efficient to centralise responsibility in the producer and establish strong, well-funded regulators to set and check safety standards. We don’t expect individual consumers to check for themselves whether the products they buy are safe to use, or whether a building is safe to enter.

Why should privacy be any different?

Just like with buildings or physical goods, we should be able to take a certain level of safety for granted with respect to our privacy. Where a collection, use or disclosure of personal information is clearly and universally harmful, the Privacy Act should prohibit it. It should not fall to the user to identify and avoid or mitigate that harm.

Privacy laws in other jurisdictions do this. Canada, for example, requires any handling of personal information to be ‘for purposes that a reasonable person would consider appropriate in the circumstances’. In Europe under the GDPR, personal data must be processed ‘fairly’. Both requirements have the effect of prohibiting or restricting the most harmful uses of personal information.

However, under our current Privacy Act, we have no such protection. There’s nothing in the Privacy Act that would stop, for example, an organisation publishing personal information, including addresses and photos, to facilitate stalking and targeting of individuals (provided they collected the information for that purpose). Similarly, there’s nothing in the Privacy Act that would stop an organisation using personal information to identify and target vulnerable individuals with exploitative content (such as gambling advertising).[2] The APPs do surprisingly little to prohibit unfair or unreasonable use and disclosure of personal information, even where it does not meet community expectations or may cause harm to individuals.

A pub test for privacy

It is past time that changed. We need a pub test for privacy. Or more formally, an overarching requirement that any collection, use or disclosure of personal information must be fair and reasonable in all the circumstances.

For organisations, the burden of this new requirement would be limited. Fairness and reasonableness are well established legal standards, and the kind of analysis required — taking into account the broader circumstances surrounding a practice, such as community expectations and any potential for harm — is already routinely conducted in the course of a Privacy Impact Assessment (a standard process used in many organisations to identify and minimise the privacy impacts of projects). Fairness and reasonableness present a low bar, which the vast majority of businesses and business practices clear easily.

But for individuals, stronger baseline protections present real and substantial benefits. A pub test would rule out the most exploitative data practices and provide a basis for trust by shifting some responsibility for avoiding harm onto organisations. This lowers the level of vigilance required to protect against everyday privacy harms — so I don’t need to read a privacy policy to check whether my flashlight app will collect and share my location information, for example. It also helps to build trust in privacy protections themselves by bringing the law closer into line with community expectations — if an act or practice feels wrong, there’s a better chance that it will be.

The ultimate goal

The goal here — of both consent reforms and a pub test — is to make privacy easier for everyone. To create a world where individuals don’t need to read the privacy policy or understand how cookies work or navigate complex settings and disclosures just to avoid being tracked. Where we can simply trust that the organisations we’re dealing with aren’t doing anything crazy with our data, just as we can trust that the builders of a skyscraper aren’t doing anything crazy with the foundations. And to create a world where this clearer and globally consistent set of expectations also makes life easier for organisations.

These changes are not revolutionary, and they might not get us to that world immediately, but they are an important step along the path, and similar measures have been effective in driving better practices in other jurisdictions.

The review of the Privacy Act is not only an opportunity to bring us back in line with international best practice, but also an opportunity to make privacy easier and more manageable for us all.


Read all posts from the Privacy in focus series:
Privacy in focus: A new beginning
Privacy in focus: Who’s in the room?
Privacy in focus: What’s in a word?
Privacy in focus: The consent catch-22
Privacy in focus: A pub test for privacy

 


[1] In its submission on the issues paper, the OAIC recommends amending the definition of consent to require ‘a clear affirmative act that is freely given, specific, current, unambiguous and informed’.

[2] These examples are drawn from the OAIC’s submission to the Privacy Act Review Issues Paper – see pages 84-88.

Privacy in focus: The consent catch-22

In this post from our ‘Privacy in focus’ blog series we discuss notice and consent — key cornerstones of privacy regulation both in Australia and around the globe — and key challenges in how these concepts operate to protect privacy.

From the 22 questions on notice, consent, and use and disclosure in the Privacy Act issues paper, there is one underlying question: Who should bear responsibility for safeguarding individuals’ privacy?

Privacy in focus: What’s in a word?

In this post from our ‘Privacy in focus’ blog series, we explore arguments for and against changes to the definition of personal information being considered by the review of the Privacy Act, and the implications of those changes.

One of the simplest but most far-reaching potential amendments to the Privacy Act is the replacement of a single word: replacing ‘about’ with ‘relates to’ in the definition of ‘personal information’.

Supporters of the change (such as the ACCC, the OAIC, and the Law Council of Australia) say it would clarify significant legal uncertainty, while also aligning Australia with the GDPR standard and maintaining consistency between the Privacy Act and the Consumer Data Right regime.

Those opposed (such as the Communications Alliance and the Australian Industry Group) warn that the change may unnecessarily broaden the scope of the Act, potentially imposing substantial costs on industry without any clear benefit to consumers.

To understand why, we’ll dig into the origins of the definition and the present uncertainty regarding its application.

Precision is important

The definition of personal information sets the scope of the Privacy Act. All the rights and obligations in the Act rely on this definition. All the obligations that organisations have to handle personal information responsibly rely on this definition. All the rights that individuals have to control how their personal information is used rely on this definition.  Personal information is the very base on which privacy regulation rests.

Any uncertainty in such an important definition can result in significant costs for both individuals and organisations. At best, uncertainty can result in wasted compliance work governing and controlling data that need not be protected. At worst, it can mean severe violations of privacy for consumers when data breaches occur as a result of failure to apply controls to data that should have been protected. Examples of the former are frequent — even OAIC guidance encourages organisations to err on the side of caution in identifying data as personal information. Unfortunately, examples of the latter are even more commonplace — the disclosure of Myki travel data by Public Transport Victoria, the publication of MBS/PBS data by the Federal Department of Health, and Flight Centre’s release of customer data for a hackathon are all recent examples of organisations releasing data subject to inadequate controls in the belief that it did not amount to personal information.

These uncertain times

According to the OAIC, the ACCC, and many others, there is substantial uncertainty as to the scope of ‘personal information’, particularly as it relates to metadata such as IP addresses and other technical information. That uncertainty was partially created, and certainly enhanced, by the decision of the Administrative Appeals Tribunal in the Grubb case, which was upheld on appeal in the Federal Court.

In the Grubb case, the Tribunal found that certain telecommunications metadata was not personal information because it was really ‘about’ the way data flows through Telstra’s network in order to deliver a call or message, rather than about Mr Grubb himself.

The ruling came as a surprise to many. The orthodoxy up until that point had been that the word ‘about’ played a minimal role in the definition of personal information, and that the relevant test was simply whether the information is connected or related to an individual in a way that reveals or conveys something about them, even where the information may be several steps removed from the individual.

Today, it’s still unclear how significant a role ‘about’ should play in the definition. Could one argue, for example, that location data from a mobile phone is information about the phone, not its owner? Or that web browsing history is information about data flows and connections between computers, rather than about the individual at the keyboard?

OAIC guidance is some help, but it’s not legally binding. In the absence of further consideration by the courts, which is unlikely to happen any time soon[1], the matter remains unsettled. Organisations are without a clear answer as to whether (or in what circumstances) technical data should be treated as personal, forcing them to roll the dice in an area that should be precisely defined. Individuals are put in the equally uncertain position of not knowing what information will be protected, and how far to trust organisations who may be trying to do the right thing.  

Relating to uncertainty

Those in favour of reform want to resolve this uncertainty by replacing ‘about’ with ‘relates to’. The effect would be to sidestep the Grubb judgement and lock in a broad understanding of what personal information entails, so that the definition covers (and the Privacy Act protects) all information that reveals or conveys something about an individual, including device or technical data that may be generated at a remove.

Those who prefer the status quo take the view the present level of uncertainty is manageable, and that revising the definition to something new and untested in Australia may lead to more confusion rather than less. Additionally, there is concern that ‘relates to’ may represent a broader test, and that the change could mean a significant expansion of the scope of the Act into technical and operational data sets.

What we think

By drawing attention to ‘about’ as a separate test, the Grubb case has led to an unfortunate focus on how information is generated and its proximity to an individual, when the key concern of privacy should always be what is revealed or conveyed about a person. In our view, replacing ‘about’ with ‘relates to’ better focuses consideration on whether an identifiable individual may be affected.

Industry concerns about expanding the scope of the Act are reasonable, particularly in the telco space, though we anticipate any expansion will be modest and manageable, as the scope of personal information will always remain bounded by the primary requirement that it be linked back to an identifiable individual. Further, we anticipate that any additional compliance costs will be offset by a clearer test and better alignment with the Consumer Data Right and the Telecommunications (Interception and Access) Act, both of which use ‘relates to’ in defining personal information.

Finally and significantly for any businesses operating outside of Australia, amending ‘about’ to ‘relates to’ would align the Privacy Act more closely with GDPR. Aligning with GDPR will be something of a recurring theme in any discussions about the Privacy Act review. This is for two reasons:

  • GDPR is an attractive standard. GDPR has come to represent the de-facto global standard with which many Australian and most international enterprises already comply. It’s far from perfect, and there are plenty of adaptations we might want to make for an Australian environment, but generally aligning to that standard could achieve a high level of privacy protection while minimising additional compliance costs for business.
  • Alignment might lead to ‘adequacy’. The GDPR imposes fewer requirements on data transfers to jurisdictions that the EU determines to have ‘adequate’ privacy laws. A determination of adequacy would substantially lower transaction and compliance costs for Australian companies doing business with the EU.

Click ‘I agree’ to continue

In our next edition of the ‘Privacy in focus’ series, we’ll take a look at consent and the role it might play in a revised Privacy Act. Will Australia double down on privacy self-management, or join the global trend towards greater organisational accountability?

[1] Because of the way that privacy complaints work, disputes about the Privacy Act very rarely make it before the courts — a fact we’ll dig into more when we cover the proposal for a direct right of action under the Act.


Read all posts from the Privacy in focus series:
Privacy in focus: A new beginning
Privacy in focus: Who’s in the room?
Privacy in focus: What’s in a word?
Privacy in focus: The consent catch-22
Privacy in focus: A pub test for privacy
Privacy in focus: Towards a unified privacy regime

Privacy in focus: Who’s in the room?

In this post from our ‘Privacy in focus’ blog series, we explore the key voices and perspectives shaping the review of the Privacy Act.

If you want to know where the review of the Privacy Act is going to land, the first question to ask is: who’s in the room?

That’s why, in this post on the Privacy Act review, we’ve analysed public submissions in response to the Government’s issues paper to see what they reveal about the perspectives of interest groups, and how this might shape the review process.

It’s loud in here

There are 154 submissions published on the Attorney General’s website, totalling 2,103 pages by our count. That’s quite a few by comparison with other consultation processes. The ACCC’s Digital Platforms Inquiry issues paper only attracted 76 submissions.

More than half of all submissions come from private companies (around 30%) and industry bodies or professional associations (around 23%). Within this segment, a wide range of industries are represented – it really is a cross section of the economy. Contributions from the Shopping Centre Council of Australia, the Obesity Policy Coalition and the Federal Chamber of Automotive Industries might have been surprising a few years ago. Today their presence is a testament to how central data has become in our lives.

The remaining submissions come from academics and research centres (around 16%), various government entities (around 13%), charities and advocacy groups (around 10%) and individuals (around 7%).

Reading the room

There are so many issues and so many differing interests and perspectives that it is difficult to draw many clear through-lines. By our rough (and inevitably subjective) count:

  • A little over 50% of all submissions are broadly in support of stronger privacy protections.
  • Around 20% advocate little or no change to the current regime.
  • The remainder are either explicitly neutral, focus on a specific issue or provide commentary on a specific industry without taking a clear position.
  • Only a small handful of submissions advocate for weaker protections.

What’s the chatter?

The small business and employee records exemptions are shaping up as a key battleground. On one side is an unlikely alliance of privacy advocates (Electronic Frontiers Australia, New South Wales Council for Civil Liberties) and tech/data companies (Google, Data Republic) opposed to the exemptions; on the other are representatives of small business and sole traders in a range of fields (Arts Law Centre of Australia, Clubs Australia and the Australian Small Business and Family Enterprise Ombudsman) who favour the exemptions.

The role of consent will be another area of contention. A large number of submissions have raised concerns about the ACCC Digital Platforms Inquiry recommendations for enhanced consent requirements. Some note the failure of the notice and consent model as a whole and emphasise the need for additional controls on how organisations use data (see particularly the Consumer Policy Research Centre and the Association for data-driven marketing and advertising). Others emphasise the dangers of consent fatigue and the need for an alternative basis for processing (see e.g., Facebook).

Finding your friends – opposing unnecessary regulation

As one might expect, submissions from industry are more likely to oppose or raise concerns about higher regulatory standards. Those worried about the potential costs of reform include:

Finding your friends – supporting higher standards

Perhaps surprisingly, many of the most data centric businesses and industry groups support reform. Data service providers (such as illion and Experian), advertisers (such as the Association for data-driven marketing and advertising), and software/technology services (such as Adobe, Atlassian, Data Republic) are much more open to reform, particularly in pursuit of GDPR adequacy.

Submissions from human rights groups (such as Digital Rights Watch and the New South Wales Council for Civil Liberties) and consumer advocacy groups (such as the Australian Communications Consumer Action Network, CHOICE, Financial Rights Legal Centre, Consumer Action Law Centre and Financial Counselling Australia) near-universally support greater protections. Academics (such as the Centre for AI and Digital Ethics at Melbourne Law School, University of Melbourne, and the Castan Centre for Human Rights Law at Monash University) and professional bodies (such as the Australian Information Security Association and the Law Council of Australia) also skew heavily towards stronger privacy rules.

What next?

Our takeaway is that there are substantially more voices in favour of reform than for the status quo. Add that to the overwhelming public support for stronger privacy protections (83% of Australians surveyed by the OAIC said they would like the government to do more to protect the privacy of their data) and it looks like there will be real pressure on the government to deliver meaningful reform.

Of course, the issues paper is just the beginning, and we’ve just scratched the surface here. So why not stay tuned while we all wait for the discussion paper? In our next edition, we’ll take a deep dive into the definition of personal information.


Read all posts from the Privacy in focus series:
Privacy in focus: A new beginning
Privacy in focus: Who’s in the room?
Privacy in focus: What’s in a word?
Privacy in focus: The consent catch-22
Privacy in focus: A pub test for privacy
Privacy in focus: Towards a unified privacy regime