elevenM collaborates with IPC NSW for PAW 2022

elevenM is excited to be collaborating with the Information and Privacy Commission NSW (IPC) for Privacy Awareness Week 2022 to help NSW government agencies manage privacy risks.

We’ve partnered with IPC on the development of a new Privacy Impact Assessment (PIA) questionnaire that NSW public sector agencies can use to assess their websites for privacy risks. By using this tool, NSW Government agencies can draw on industry best practices to more efficiently assess privacy risks and identify remediation actions. 

The new IPC tool draws from elevenM’s PIA and privacy tooling suite. Anyone who’s done a PIA will know that PIA tools come in many shapes and sizes.

There are tools for specific industries and business contexts, and for individual jurisdictions and legal frameworks. Some tools function primarily as questionnaires; others offer guidance and recommendations. Some are designed to be used by privacy experts, while others bake in expert knowledge so they can be used by anyone in the business.

elevenM’s privacy experts have worked with all these kinds of PIA tools, using them with many business clients and in a variety of contexts. We’ve drawn on this collective experience to create a library of PIA and privacy tools that is both useful and practical.

If you’d like more information about our PIA tools, please contact us at hello@elevenm.com 

For more information about our collaboration with IPC, please refer to their PAW 2022 webpage.

When it’s all by design

elevenM Principal Arjun Ramachandran reflects on the explosion of “by design” methodologies, and why we must ensure the label doesn’t become a hollow catchphrase.

Things catch on fast in business.

Software-as-a-service had barely taken hold as a concept before enterprising outfits saw opportunities to make similar offerings up and down the stack. Platform-as-a-service and infrastructure-as-a-service followed swiftly, then data-as-a-service.

Soon enough, the idea broke free of the tech stack entirely. There emerged CRM-as-a-service, HR-as-a-service and CEO-as-a-service.

“As a service” could in theory reflect a fundamentally new business model. Often though, simply appending the words “as a service” to an existing product gave it a modern sheen that was “on trend”. Today, you can get elevators-as-a-service, wellness-as-a-service and even an NFT-as-a-service.

A few days ago, I came across a hashtag on Twitter – #trustbydesign – that gave me pause about whether something similar was underway in an area closer to home for me professionally.

For those in privacy and security, the “by design” imperative is not new. Nor is it trite.

“Privacy by design” – in which privacy considerations are baked into new initiatives at design phase, rather than remediated at the end – is a core part of modern privacy approaches. In a similar way, “secure by design” is now a familiar concept that emphasises shifting security conversations forward in the solution development journey, rather than relegating them to bug fixes or risk acceptances at the end.

But could we be entering similar territory to the as-a-service crew? For those involved broadly in the pursuit of humanising tech, on top of privacy by design and secure by design there are now exclamations of safety by design, resilience by design, ethical by design, care by design, empathy by design and the aforementioned trust by design.

Don’t get me wrong, I love a good spin-off. But as we continue to promote doing things “by design”, it’s worth keeping an eye on its usage and promotion, so it doesn’t become a hollow catchphrase at the mercy of marketing exploitation (for a parallel, see how some security peeps are now vigorously standing up to defend “zero trust”, a security approach, against assertions that it’s “just a marketing ploy”).

Doing things “by design” is important and valuable. It speaks to a crystallising of intent. A desire to do things right, and to do them up front. In fields like privacy and security, where risks have historically been raised late in the piece or as an afterthought (and sometimes ignored as a result), the emergence and adoption of “by design” approaches is a welcome and impactful change.

As “by design” catches on as a buzzword, however, it’s vital we ensure there’s substance sitting behind each of its variants. Consider the following two examples.

Privacy by design
Privacy Impact Assessments (PIAs) are rigorous, systematic and well-established assessments that provide structure and tangible output to the higher intent of “privacy by design”. Regulators like the OAIC endorse their use and publish guidance on how to conduct them. At elevenM, we live and breathe PIAs. Whether we’re undertaking detailed gap analyses and writing reports (narrative, factual, checklist-based, metric-based, anchored to organisational risk frameworks, national or international), training clients on PIAs or supporting them with automated tools and templates, we’re making the assessment of privacy impacts – and therefore privacy – easier to embed in project lifecycles.

Ethics by design
The area of data ethics is a fast-emerging priority for data-driven businesses. We’ve been excited to work with clients on ways of designing and implementing ethical principles, including through the development of frameworks and toolkits that operationalise these principles into actions organisations can take to make their data initiatives more ethical by design.

At a minimum, a similar structured framework or methodology should be articulated for any “by design” philosophy.

A final consideration for businesses is the need to synthesise these “by design” approaches as they take hold. There’s some risk that these various imperatives – privacy, security, data governance, ethics – will compete and clash as they converge at the design phase. It’ll be increasingly vital to have teams with cross-disciplinary capability or expertise who can efficiently integrate the objectives and outcomes of each area towards an overall outcome of greater trust.

We leave the closing words to Kid Cudi: “And the choices you made, it’s all by design”.

If we can help you with your “by design” approaches, reach us at hello@elevenm.com


Has the cookie crumbled?

elevenM’s Chaitalee Sohoni dives into the what and why of third-party cookies, Google’s plan to phase them out and what this means for businesses and individuals alike.

By 2023, Google Chrome will phase out support for third-party cookies as part of its Privacy Sandbox Initiative, with Stage 1 set to start in late 2022.

Google first announced its intention to eliminate third-party cookies from its Chrome browser in early 2020 and made it explicit that it ‘will not build alternate identifiers to track individuals as they browse across the web’.

If you have been on a website in the last couple of years, you might have encountered an annoying pop-up inviting you to read the company’s ‘cookie policy’ and review your cookie preferences. Chances are you clicked ‘agree’ without reading it and moved on to the content of the page, mostly because such policies are tedious to read. The cookie policy on any website essentially notifies you that a cookie is downloaded to your computer to ‘enhance’ your browsing experience each time you visit the website.

But what exactly are cookies and how do they affect you?

A cookie is a small text file of data that is unique to each user. When you visit a new website, cookies are created to identify you and to personalise your experience based on your browsing history.

Cookies themselves aren’t bad. What’s problematic is what companies choose to do with them, because that’s where the concerns about data privacy arise.

Cookies were invented by Lou Montulli in 1994 and have since been the backbone of the internet browsing experience. Cookies are created to remember and recall information that is useful while browsing, such as login details or the previous page you visited on a website. Without cookies, browsing the internet would be an extremely frustrating process — imagine adding an item to your cart when you shop online, and having it disappear each time you go back to add more items. Think Dory from Finding Nemo.
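To make this concrete, here’s a minimal sketch of first-party cookie mechanics in the browser, written in TypeScript. The ‘cart’ cookie and its value are hypothetical, and a real shop would more commonly set the cookie server-side via a Set-Cookie response header:

```typescript
// A minimal sketch of first-party cookie mechanics in the browser.
// The "cart" cookie and its value are hypothetical.

// The shop stores your cart ID in a cookie so it survives page loads:
document.cookie = "cart=abc123; path=/; max-age=86400"; // kept for one day

// On a later page load, the site reads the cookie back:
function getCookie(name: string): string | undefined {
  return document.cookie
    .split("; ")
    .find((pair) => pair.startsWith(name + "="))
    ?.split("=")[1];
}

console.log(getCookie("cart")); // "abc123" – your cart is still there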

There are two kinds of cookies: first-party and third-party. First-party cookies are created by, and downloaded from, the primary website you are visiting.

Third-party cookies, however, are generated and saved on your computer by other websites whose content is embedded in the primary website you browse. For example, when you visit a website, it’ll most likely contain advertisements or images from other websites, or even a Facebook ‘like’ button. Even if you don’t click on them, cookies from those websites are created and stored on your system.

If you have ever had an advertisement follow you around the internet, third-party cookies are the reason. Based on the websites you visit, cookies gather a great deal of information about you, such as your age bracket, gender, location, interests and personal preferences. Advertising companies use cookies to track your activity across the internet, building a profile of your interests from your browsing history so they can send you personalised advertisements. Cookies allow companies to make more money by helping them find the right audience for their products. Platforms such as Facebook and Google are heavily incentivised to ensure advertisements from brands reach the targeted users.
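To illustrate how this works under the hood, here’s a minimal sketch of a hypothetical tracking server, written in TypeScript using Node’s built-in http module. The domain ‘tracker.example’ and the ‘uid’ cookie are invented for illustration; the point is that the browser attaches the tracker’s cookie to the pixel request no matter which site embedded it, letting the tracker stitch your visits into a profile:

```typescript
// A sketch of a hypothetical third-party tracker ("tracker.example",
// cookie name "uid" – both invented for illustration).
import { createServer } from "node:http";
import { randomUUID } from "node:crypto";

const profiles = new Map<string, string[]>(); // uid -> sites seen on

createServer((req, res) => {
  // The browser attaches tracker.example's cookie to this request no
  // matter which site embedded the tracking pixel.
  const uid =
    /(?:^|; )uid=([^;]+)/.exec(req.headers.cookie ?? "")?.[1] ?? randomUUID();

  // The Referer header reveals the site you were actually browsing.
  const site = req.headers.referer ?? "unknown";
  profiles.set(uid, [...(profiles.get(uid) ?? []), site]);

  // Re-set the cookie so the profile keeps growing for a year.
  res.setHeader(
    "Set-Cookie",
    `uid=${uid}; Max-Age=31536000; SameSite=None; Secure`
  );
  res.end(); // in practice, this would return a 1x1 pixel image
}).listen(8080);
```

This also shows why blocking third-party cookies breaks the model: without the ‘uid’ cookie, each pixel request looks like a brand-new, anonymous visitor.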

With its Privacy Sandbox Initiative, Google aims to withdraw support for third-party cookies. At first glance, this move appears to be a step in the right direction for data privacy, but Google is a tad late to this party. Mozilla’s Firefox, Apple’s Safari and Brave blocked third-party cookies years ago, making them more privacy-protective browsers. There’s also DuckDuckGo, a privacy-focused search engine that also offers a browser for mobile phones.

Google may not be the first to block third-party cookies, but Chrome is the most popular browser, with a global market share of 64.4% as of January 2022 – significant compared to Safari and Firefox, which account for only 16.9% and 3.9% respectively. Google’s plan to phase out cookies is therefore a big deal in the world of the internet.

With Google hopping on the bandwagon, does this spell the end for third-party cookies? Maybe. Does it mean that your browsing history won’t be tracked anymore? The answer is not that simple.

Eliminating third-party cookies does remove the power advertising companies have to track individuals, but it places that power directly in Google’s hands. With Chrome no longer relying on third-party cookies to collect data about users, Google will no longer support companies in selling targeted web advertisements to individuals. This move will give Google an upper hand in collecting first-party data from users, including data from mobile applications, to which the cookie ban doesn’t apply.

Google’s move will have a drastic impact on businesses and advertisers, who will need to rely heavily on first-party data or find alternatives to reach their audiences. In a joint statement, the Association of National Advertisers and the American Association of Advertising Agencies pointed out that ‘Google’s decision to block third-party cookies in Chrome could have major competitive impacts for digital businesses, consumer services, and technological innovation.’

Proposed legislative changes in this area will also affect businesses. One of the changes proposed in the review of the Privacy Act currently underway is replacing ‘about’ with ‘related to’ in the definition of personal information in the Privacy Act 1988. The purpose of this change is to explicitly bring technical identifiers, such as IP addresses or the unique, persistent identifiers used in cookies, within the scope of the Act. Under the new definition, unique identifiers are very likely to be considered personal information, which will have a bearing on websites that depend on them to track individuals.

Google initially wanted to replace third-party cookies with Federated Learning of Cohorts (FLoC). FLoC was designed to track individuals based on their web browsing and group them into cohorts defined by similar interests. However, in January this year, Google announced that it was replacing FLoC with Topics. Topics is also built on the idea of interest-based advertising, with the browser determining top interests for users based on their browsing history. Google states ‘it provides you with a more recognizable way to see and control how your data is shared, compared to tracking mechanisms like third-party cookies.’
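For a sense of the practical difference, here’s a sketch of how an ad script might use the Topics API, based on the proposal as it stood at the time of writing (the API shape may change, and the ‘ads.example’ endpoint is hypothetical). Rather than reading a cross-site cookie, the script asks the browser for a few coarse interest signals:

```typescript
// A sketch based on the Topics API proposal as it stood at the time of
// writing; the final API may differ. "ads.example" is hypothetical.
async function selectAd(): Promise<void> {
  // Ask the browser for a few coarse interest topics it inferred locally
  // from recent browsing history (numeric IDs map to a public taxonomy,
  // e.g. one ID might correspond to "Fitness").
  const topics = await (document as any).browsingTopics();

  // The ad server sees only these broad topic IDs – never the underlying
  // browsing history or a cross-site identifier.
  const ids = topics.map((t: { topic: number }) => t.topic);
  await fetch(`https://ads.example/select?topics=${ids.join(",")}`);
}
```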

Google is still exploring options to fulfil its promise to phase out third-party cookies by 2023, a delay from its initial 2022 target. We may have to wait a little longer to see how Google ultimately replaces them.

[UPDATE: An earlier version of this post stated that Google intended to replace third-party cookies with Federated Learning of Cohorts (FLoC); it has since opted to replace them with Topics.]

The enduring modern privacy challenge

Privacy has evolved considerably since ancient thinkers first wrote about it as a concept. In this post, elevenM’s Arjun Ramachandran plots this evolution and discusses how finding skilled people has become the enduring modern challenge for the privacy profession. We also preview elevenM’s upcoming webinar.

In his musings over privacy, Aristotle could hardly have contemplated the complex challenges that would be heralded by the digital future.

The ancient Greek philosopher drew a distinction between private family life (oikos) and the public sphere of political affairs (polis). While this conception of the division between domestic life and the state remains valid, the landscape today is so much more complex.

Our lives today play out in, and are made meaningful by, an intricate web of relationships – with other people, with businesses and with information – largely mediated through our daily use of digital services and participation in online platforms.

The modern privacy challenge – as distinct from Aristotle’s paradigm where privacy was more synonymous with domestic life – is perhaps to locate and define privacy within this more complex picture. In this context, our experiences of privacy are increasingly not simply the result of our personal choices (such as deciding to remain in the privacy of the family home). Instead, they’re dictated by how this broader digital and information ecosystem – one in which we all must necessarily participate – is engineered.

Banks, government agencies, streaming services, social media platforms … and all manner of services have now become, by default, digital businesses. So it is that the digital platforms we use to communicate and the organisations with which we share our information have become outsized gatekeepers of our privacy.

We know that there is strong community sentiment in favour of privacy, which provides direction for businesses seeking us as customers. Regulations such as the Privacy Act 1988 and the EU’s GDPR also set legal baselines for how privacy must be protected. But realising these outcomes ultimately falls to organisations and how they collectively handle our information.

In our work with many of these companies, we’ve seen growing intent and motivation over recent years to get privacy right. There has been steady investment in establishing dedicated privacy teams and building privacy processes and programs.

But when their privacy capability reaches a certain point of maturity, many organisations appear to hit the same wall: an inability to find skilled professionals to keep doing the good work of privacy. “The global shortage of privacy professionals” has sadly become a well-documented state of affairs.

The challenge will only intensify in coming years as digitisation expands, with growing use of data-driven technologies like machine learning and artificial intelligence. This is to say nothing of the review of the Privacy Act currently underway, and the expanded compliance requirements it will bring.

At elevenM, we’ve been talking about this problem at length amongst ourselves and with some of our industry colleagues. We’d love to open up this problem to a broader audience – such is its breadth and criticality that we believe it requires a truly collective approach.

On February 8, elevenM’s Melanie Marks and Jordan Wilson-Otto, in partnership with the International Association of Privacy Professionals, will deliver a presentation diving deeper into the talent drought and exploring solutions. The presentation will be followed by a multidisciplinary panel discussion featuring leaders in privacy, academia and industry.

If you’d like to attend, click the link below to register. https://iapp.org/store/webconferences/a0l1P00000DbKYuQA


A Lawyer, a Chemist, an Ethicist, a Copywriter and a Programmer Walk Into a Bar

Presenters:
Melanie Marks, CIPP/E, CIPM, FIP, Principal, elevenM
Jordan Wilson-Otto, Principal, elevenM

Panellists:
Chantelle Hughes, Executive Manager, Technology Community, Commonwealth Bank
Sacha Molitorisz, Lecturer, Centre for Media Transition, University of Technology Sydney
Jacqui Peace, AU/NZ Privacy Management, Google
Daimhin Warner, CIPP/E, Principal & Director, Simply Privacy

Towards a safer online world for children and the vulnerable

elevenM’s Jordan Wilson-Otto shares findings from recent research on the privacy risks and harms for children and vulnerable groups.

Yesterday the Government initiated two consultations (on the Online Privacy Bill and the Privacy Act Review), both of which include a focus on better protecting kids. elevenM worked on key research that informed thinking behind these changes, and we are delighted to share the outputs of that research here.

Our research was commissioned by the Office of the Australian Information Commissioner and conducted in partnership with two leading academics from Monash Law School (Normann Witzleb and Moira Paterson). It provides an in-depth analysis of the privacy risks and harms that can arise for children and for other vulnerable groups online and makes recommendations for additional protections that could be put in place to mitigate these risks.

We’re proud of our contribution, and we hope it might serve as a useful reference for those drafting submissions to the Online Privacy Bill exposure draft and the Privacy Act Review Discussion Paper.

If you don’t have time to read the whole report, here are some of our key findings:

  • Children can be vulnerable online due to limitations in their basic and digital literacy, cognitive abilities and capacity for future-focused decision making.
  • Individual characteristics and situational factors shape susceptibility to harm, even for adults. Vulnerability is dynamic and contextual, and its causes are complex and varied. Identifying individuals who need greater protection is not always straightforward.
  • Children and other vulnerable groups face a wide variety of harms online. Mostly, these arise from the monetisation of their personal information and the manipulation of their behaviour, but they also arise from the social impacts of sharing personal information on reputation and life opportunities, and from e-safety risks. But it’s also important to remember that no environment is risk-free, and that participation, the right to take one’s own risks and the development of digital skills are also important goals.
  • Digital platforms have all adopted measures aimed at protecting children and other vulnerable groups. However, these are highly variable between platforms and are often difficult to navigate and limited in their effectiveness.
  • There is an international trend towards implementing additional privacy protections for children, with the UK the most advanced, followed by the EU and the USA. We review enhanced privacy protections for children and other vulnerable groups across these jurisdictions.
  • Reliance on consent should be limited. In most cases with social media and digital platforms, even adults are not able to understand the conditions they’re agreeing to.
  • For everyone, but particularly for kids, privacy transparency should aim for more than mere disclosure of material facts, and should instead aim to educate, empower and enable privacy self-management in line with a child’s developing needs and capabilities.

Finally, we made a range of recommendations for additional protections that could be put in place either via an Online Privacy Code, or as part of the broader review of the Privacy Act in order to better protect our most vulnerable. Some of the key ones are:

  • To establish an overriding obligation to handle personal information lawfully, fairly and reasonably, taking into account the best interests of the child (where children are involved).
  • To place a greater onus on platforms to verify users’ age (taking into account any privacy risks arising from verification measures).
  • To strengthen requirements for consent, taking into account whether it is reasonable to expect that an individual understands what it is that they are consenting to.
  • To strengthen privacy transparency requirements, including requirements to collect engagement metrics for privacy notifications and privacy features, and to demonstrate that steps taken to ensure user awareness of privacy matters are reasonable.

We’re encouraged to see that the focus on better protecting kids online is already generating national and international headlines, and hope this research will play a role in steering us towards the right reforms. Ultimately, any new code or reforms to the Privacy Act must not only protect children and vulnerable groups from online risks, but also enable them to fully access and participate in the benefits of the online world.

If you would like to discuss this research in more detail, or would like us to assist you to understand the broader Privacy Act changes being considered, drop us a line at hello@elevenm.com.

PAW 2021 – That’s a wrap

Privacy Awareness Week was 5-9 May 2021.

In our final post for Privacy Awareness Week (PAW), we share our five highlights and observations from the week.

1. A privacy win during PAW

The timing may have been coincidental, but we’ll take it. There was a notable win for privacy this week, with the Senate Committee that reviewed the Government’s inter-agency data sharing bill – the Data Availability and Transparency Bill – recommending the bill not be passed in its current form, noting a need for stronger privacy protections and security measures (among other things).

Our advocacy for greater attention to the privacy risks in the bill (as part of a collaborative submission with other privacy colleagues) was quoted in the Senate Committee’s report and in the news media this week.


2. Momentum building

We were energised to hear this week just how much focus and attention there is on privacy, particularly from a regulatory perspective. At a panel of regional privacy regulators hosted by the International Association of Privacy Professionals on Tuesday, we got insight into the breadth of activity currently underway.

At the Commonwealth level, clearly the focus is on the review of the Privacy Act. The States and Territories are also running various projects to bolster privacy protections, from the privacy officers project in Victoria, mandatory breach reporting in NSW and the privacy champions network in Queensland, to the focus on managing privacy in complex cross-cultural contexts in the Northern Territory.

Overseas, New Zealand is looking at improvements within its public sector, the Philippines will be launching a privacy mark and Singapore is implementing its new data protection law.

Many of the regulators on Tuesday also expressed the view that it is time for everyday Australians to make privacy a priority and realise that every time we hand over our data, we’re not only making an individual decision but also contributing to the future fabric of our society.

3. Privacy spat!

What better way to draw attention to trust and transparency during PAW than a stoush between two technology platforms over privacy.

Signal and Facebook went at it after Signal used Facebook’s own advertising platform to create ads that exposed the categories Facebook uses to classify users. The ads appeared as placards and contained customised messages such as: “You got this ad because you’re a certified public accountant in an open relationship. This ad used your location to see you’re in South Atlanta. You’re into natural skin care and you’ve supported Cardi B since day one.”

Facebook labelled the move a stunt, while Signal claimed Facebook disabled its account in response. Either way, fantastic timing for PAW.

4. Privacy is precious

Speaking of ads, our attention this week was drawn to New Zealand’s TV commercial for privacy, created to raise awareness of its new Privacy Act, which came into force in December 2020. The ads feature the theme “Privacy is precious” and are at once simple to understand and wonderfully evocative. Check it out here.

The Kiwis have a strong track record of pumping out great videos to raise awareness – see the Air New Zealand air safety videos and the New Zealand Government’s online safety ads. Perhaps it’s time to add “privacy advertisements” to the list of trans-Tasman rivalries, which already includes cricket, rugby and netball. Can Australian creatives take up the challenge and create an even better pitch to help the Australian community prioritise privacy?

5. Hurray for privacy drinks

Finally, it was great to celebrate Privacy Awareness Week with an old-fashioned drink with friends and colleagues. elevenM hosted drinks at O Bar in Sydney on Wednesday night, and we were thrilled to be back together in person with so many of our valued friends, clients, partners, colleagues and other fellow travellers in attendance.

It reminded us what a diverse and vibrant community we have and filled us with inspiration and optimism about the future, as we work together to solve some of the most complex issues of our time. Thanks to all who came, and we hope those that couldn’t will make it next time.