Privacy in focus: Who’s in the room?

In this post from our ‘Privacy in focus’ blog series, we explore the key voices and perspectives shaping the review of the Privacy Act.

If you want to know where the review of the Privacy Act is going to land, the first question to ask is ‘who’s in the room?’

That’s why, in this post on the Privacy Act review, we’ve analysed public submissions in response to the Government’s issues paper to see what they reveal about the perspectives of interest groups, and how this might shape the review process.

It’s loud in here

There are 154 submissions published on the Attorney-General’s website, totalling 2,103 pages by our count. That’s a lot compared with other consultation processes: the ACCC’s Digital Platforms Inquiry issues paper attracted only 76 submissions.

More than half of all submissions come from private companies (around 30%) and industry bodies or professional associations (around 23%). Within this segment, a wide range of industries is represented – it really is a cross-section of the economy. Contributions from the Shopping Centre Council of Australia, the Obesity Policy Coalition and the Federal Chamber of Automotive Industries might have been surprising a few years ago. Today, their presence is a testament to how central data has become in our lives.

The remaining submissions come from academics and research centres (around 16%), various government entities (around 13%), charities and advocacy groups (around 10%) and individuals (around 7%).

Reading the room

There are so many issues and so many differing interests and perspectives that it is difficult to draw many clear through-lines. By our rough (and inevitably subjective) count:

  • A little over 50% of all submissions are broadly in support of stronger privacy protections.
  • Around 20% advocate little or no change to the current regime.
  • The remainder are either explicitly neutral, focus on a single issue, or comment on a particular industry without taking a clear position.
  • Only a small handful of submissions advocate for weaker protections.

What’s the chatter?

The small business and employee records exemptions are shaping up as a key battleground. On one side, an unlikely alliance of privacy advocates (Electronic Frontiers Australia, New South Wales Council for Civil Liberties) and tech/data companies (Google, Data Republic) opposes the exemptions; on the other, representatives of small business and sole traders in a range of fields (Arts Law Centre of Australia, Clubs Australia and the Australian Small Business and Family Enterprise Ombudsman) favour them.

The role of consent will be another area of contention. A large number of submissions have raised concerns about the ACCC Digital Platforms Inquiry recommendations for enhanced consent requirements. Some note the failure of the notice and consent model as a whole and emphasise the need for additional controls on how organisations use data (see particularly the Consumer Policy Research Centre and the Association for Data-driven Marketing and Advertising). Others emphasise the dangers of consent fatigue and the need for an alternative basis for processing (see, e.g., Facebook).

Finding your friends – opposing unnecessary regulation

As one might expect, submissions from industry are more likely to oppose or raise concerns about higher regulatory standards, often citing the potential costs of reform.

Finding your friends – supporting higher standards

Perhaps surprisingly, many of the most data-centric businesses and industry groups support reform. Data service providers (such as illion and Experian), advertisers (such as the Association for Data-driven Marketing and Advertising), and software/technology services (such as Adobe, Atlassian and Data Republic) are much more open to reform, particularly in pursuit of GDPR adequacy.

Submissions from human rights groups (such as Digital Rights Watch and the New South Wales Council for Civil Liberties) and consumer advocacy groups (such as the Australian Communications Consumer Action Network, CHOICE, Financial Rights Legal Centre, Consumer Action Law Centre and Financial Counselling Australia) near-universally support greater protections. Submissions from academics (such as the Centre for AI and Digital Ethics and Melbourne Law School at the University of Melbourne, and the Castan Centre for Human Rights Law at Monash University) and professional bodies (such as the Australian Information Security Association and the Law Council of Australia) also skew heavily towards stronger privacy rules.

What next?

Our takeaway is that there are substantially more voices in favour of reform than for the status quo. Add to that the overwhelming public support for stronger privacy protections (83% of Australians surveyed by the OAIC would like the government to do more to protect the privacy of their data) and it looks like there will be real pressure on the government to deliver meaningful reform.

Of course, the issues paper is just the beginning, and we’ve just scratched the surface here. So why not stay tuned while we all wait for the discussion paper? In our next edition, we’ll take a deep dive into the definition of personal information.


Read all posts from the Privacy in focus series:
Privacy in focus: A new beginning
Privacy in focus: Who’s in the room?
Privacy in focus: What’s in a word?
Privacy in focus: The consent catch-22

When your milk comes with a free iris scan

elevenM’s Melanie Marks’ regular trip to the supermarket brings her face-to-face with emerging privacy issues.

A couple of weeks ago, as I was nonchalantly scanning my groceries, I looked up and was shocked to see a masked face staring back at me. 

After I realised it was my own face, fright turned to relief and then dismay as it hit me that the supermarkets had – without consultation, and with limited transparency – taken away my freedom to be an anonymous shopper buying milk on a Sunday.

Just days later, the press outed Coles for its introduction of cameras at self-service checkouts. Coles justified the roll-out on the basis that previous efforts to deter theft (signs displaying images of CCTV cameras, threats to prosecute offenders, bag checks, checkout weighing plates and electronic security gates) have not been effective. The next frontier, apparently, is a very close-up video selfie to enjoy as you scan your goodies.

SmartCompany reported on the introduction of self-surveillance tech last year, explaining the psychology of surveillance as a deterrent against theft. How much a person steals comes down to their own “deviance threshold” — the point at which they can no longer justify their behaviour alongside a self-perception as a good person.

The supermarkets’ strategy of self-surveillance provides a reminder that we are being watched, which supposedly evokes self-reflection and self-regulation.

This all sounds reason enough. Who can argue with the notion that theft is bad, and we must act to prevent it? We might also recognise the supermarkets’ business process excellence in extending self-service to policing.

Coles argues that it provides notice of the surveillance via large posters and signs at the front of stores. It says that the cameras are not recording, and claims that the collection of this footage (what collection, if no record is being made?) is within the bounds of its privacy policy (last updated November 2018).

At the time of writing, the Coles privacy policy makes no mention of video surveillance or the capturing of images, though it does cover the use of personal information for “investigative, fraud, and loss prevention” activities.

Woolworths has also attracted criticism over its use of the same software, which it began trialling last year. Recent backlash came after Twitter user @sallyrugg called on the supermarket to please explain any connection between the cameras, credit card data and facial recognition technology it employs. Like Coles, Woolies says no recording takes place at the self-serve registers and that the recent addition it has made to its privacy policy regarding its use of cameras pertains only to the use of standard CCTV in stores.

So it would appear the supermarkets have addressed the concerns. No recordings, no data matching, covered by privacy policy. And my personal favourite, choice: “If you do not wish to be a part of the trial, you are welcome to use the staffed checkouts.”

But these responses are not sufficient. Firstly, there is no real choice in relation to the cameras when a staffed checkout is unavailable. Secondly, our notice and consent models are broken: privacy policies confer far less power on consumers than they appear to. We don’t read them, and even when we do, we have no bargaining power. And lastly, the likelihood of function creep is high. It is not a stretch to imagine that the next step in the trial will be to pilot the recording of images for various purposes, and that this could be done legally with little constraint.

On a final note, this experience reflects many of the challenges in our current privacy framework, including the balance of consumer interests against commercial interests, the strain on current consent models, and even the desire for a right to be forgotten.

Thankfully, these issues are all being contemplated by the current review of the Privacy Act (read our ongoing blog series on the review here). We need these protections and structures in place to create a future in which we milk buyers can be free and anonymoos.


Privacy in focus: A new beginning

Welcome to our new blog series, “Privacy in focus”. With the review of the Privacy Act currently under way, this series will outline and explain the key concepts under the microscope, and explore solutions to current privacy challenges. In this first post, we outline what you can expect from the series.

The notion that we must improve the protection of privacy in the digital age has universal appeal. In a highly polarised world, where consensus rarely feels within reach, that is no small thing.

For all its conveniences, the rapid and widescale digitisation of our economies has contributed to an environment in which individuals frequently find themselves vulnerable to abuses of their personal information. In the face of these dangers, established regulatory approaches and business strategies for data handling and protection fall short of what’s required to engender widespread trust.

So it’s little wonder that Australia, like jurisdictions around the world, is looking more intently at its privacy framework, via a comprehensive review of the Privacy Act. The Act once provided a solid foundation for privacy protection, but unprecedented technological change and the aforementioned threats to privacy invite a closer look at its operation and objectives.

In this series, our goal is to look more closely at the specific questions and concepts being considered by the review. We want to foster a deeper understanding of why these foundational concepts are fundamental to “good privacy”, and of the reform options being contemplated in light of the realities of the modern economy.

Among the topics we will dive into are:

  • The definition of personal information
  • The durability of concepts like notice and consent
  • Organisational accountability for privacy
  • A direct right of action for privacy
  • The privacy of children and vulnerable people
  • The validity of today’s exemptions to the Privacy Act

We may uncover further topics as the series develops.

In our travels as practitioners, we know many businesses are committed to the privacy of their customers. However, many are caught in the widening gap between the competitive need to pursue data-driven business strategies and regulatory frameworks that aren’t fit for purpose in a digitally driven economy.

Our hope is that the reform process goes some way to closing this gap, for the benefit of individuals and businesses, and that, through this series, we can support a deeper understanding of the key issues and possible ways forward among policymakers, legislators, practitioners and consumers.



Privacy, COVID and vulnerable communities in the golden city

elevenM’s Cassie Findlay brings a first-hand account of how privacy considerations are playing a role in shaping COVID-19 outcomes in parts of the US. 

It’s no secret that the city of San Francisco and the surrounding counties that make up the Bay Area are home to some of the most stark inequities in the world.  

Having just returned home after four and a half years living and working there, I can confirm that the evidence of staggering wealth existing side by side with extreme poverty and homelessness is everywhere, and it is shocking. Encampments dot the city streets, and the people in them lack basic sanitation and medical services. Solutions are often temporary and deployed only in response to residents’ complaints.

Bringing a pandemic response into this mix was never going to be easy. The local and state governments’ response to the COVID crisis has, by overall US standards, not been too bad – but it has not worked as well for the region’s most vulnerable people.

A case in point: late last year, the cities of Oakland and San Francisco axed a testing program offered by the Google affiliate Verily. Introduced in March, the platform screened people for symptoms, booked appointments and reported test results. Unfortunately, from a privacy perspective, the design of the program added friction to the uptake of critical services in a pandemic.

In a letter to the California Secretary of Health, the City of Oakland’s Racial Disparities Task Force raised concerns about the collection of personal data on the platform amidst a crisis of trust amongst Black and Latinx communities in how their personal information might be used or shared by governments and corporations. Participants were required to sign an authorisation form that says their information can be shared with multiple third parties involved in the testing program, including unnamed contractors and state and federal health authorities. 

As explained by the Electronic Frontier Foundation’s Lee Tien to local public radio station KQED: “While the form tells you that Verily may share data with ‘entities that assist with the testing program,’ it doesn’t say who those entities are. If one of those unnamed and unknown entities violates your privacy by misusing your data, you have no way to know and no way to hold them accountable.”  

Given the need for better and more accessible testing for people experiencing homelessness, and the known severity of the impact of COVID on Black and Latinx communities, obstacles like this to testing uptake are concerning. Other testing services in Oakland and San Francisco have fortunately adopted approaches based on more direct engagement and building of trust in these communities, as opposed to defaulting to an app-based solution with the trust and privacy concerns that entails.  

This case shows just how much trust issues around the use of personal information can affect critical services to vulnerable communities, and it has valuable lessons for those of us working on the delivery of public services with technology. 

My key takeaways are: 

  • Consumers understand and take seriously the trade-offs involved in exchanging personal information for services, discounts and other benefits. 
  • We are moving beyond approaches to data collection that treat consumers as a homogeneous group in terms of their willingness to share, but we can safely assume that unknown secondary purposes for their data will always be regarded with suspicion. 
  • Success will increasingly depend on having a more nuanced picture of your ‘customers’, including their trust in your organisation or sector, whether it be commercial enterprise or public health services. 
  • Building a data governance strategy that can track and maintain a picture of your business, the actors within it (including end users and customers), and evolving requirements (including less tangible ones like societal attitudes) is a great foundation for privacy policy and practice that respects diversity and can evolve as the landscape changes around you.

 

What non-mask wearers teach us about security awareness

elevenM Principal Arjun Ramachandran explores why observance of coronavirus restrictions and advice varies across countries and societies, and the potential lessons for those in the game of persuading people to adopt good security behaviours.  


“Wear a mask”. “Practice social distancing”. “Isolate”.

Clear, consistent, universal.

But cast your eyes from country to country, even community to community, and you see incredible variance in how the advice sticks.

The management of COVID-19 in the community highlights a core challenge in how companies cultivate positive security and privacy behaviours among their people. Clear guidance and engaging messages alone don’t always get the job done.

As public health practitioners have learned through the pandemic, and as those of us engaged in security and privacy persuasion must recognise, we work in a broader context.

The fingerprints of culture are evident in how different societies are responding to coronavirus guidelines and restrictions. Values like individualism, community, mutual obligation, respect for the elderly and deference to authority – and the extent to which they dominate a culture – clearly influence how communities behave, and how they will respond to advice and guidance.

“Maybe we’ll change our culture so that it’s not expected or brave of you to go to work sick. Maybe we’ll start to protect each other the way Asian cultures do. It’s pretty normal in Asian societies to wear a mask when you’re sick when you go out in public and to stay home if you can. We are the exact opposite. We wear masks to protect ourselves and we feel free to show up at a meeting when we have a fever.”
VICE

Sure – when you’re trying to inculcate good security or privacy practices, repeatedly broadcasting actionable advice will get these messages onto the radar of employees. Heck, if you’re clever enough to make the advice funny or entertaining, it might even go viral! You’ll have smashed a bunch of internal engagement metrics and hit some awareness goals.

But as with “Wear a mask!”, lack of awareness isn’t always the barrier. People can know what to do and still act otherwise. Or they might follow the rules, but only in circumstances where compliance is monitored or explicitly defined.

If we want to go beyond compliance, and if we want behaviours to be both lasting and self-applied across contexts, then our goal must be for employees to internalise and identify with those desirable behaviours.

That’s why we encourage organisations embarking on security or privacy education activities to look at shaping culture as a vital complement (if not a precursor) to their education and awareness activities.

Culture is, ultimately, shared values and beliefs expressed through collective behaviours and practices.

Research tells us that an alignment of values creates the conditions for people to internalise behaviours.

Yet while organisations abound in discrete bits of security advice (“don’t click this, make sure you do that”), the values underpinning the desired security and privacy behaviours are often never defined or articulated to employees. Fixing this could be as simple as revisiting the company’s existing set of corporate values and expressing how security and privacy are integral to that value set.

For staff to identify with values and desired behaviours, they will also expect to see them being exhibited and advocated by those they admire or respect. This is where an organisation’s high-profile security champions can play a role, and where its most senior leaders have a responsibility.

For more on security culture, check out our recent work.