When it’s all by design

elevenM Principal Arjun Ramachandran reflects on the explosion of “by design” methodologies, and why we must ensure the label doesn’t become a hollow catchphrase.

Things catch on fast in business.

Software-as-a-service had barely taken hold as a concept before enterprising outfits saw opportunities to make similar offerings up and down the stack. Platform-as-a-service and infrastructure-as-a-service followed swiftly, then data-as-a-service.

Soon enough, the idea broke free of the tech stack entirely. There emerged CRM-as-a-service, HR-as-a-service and CEO-as-a-service.

“As a service” could in theory reflect a fundamentally new business model. Often though, simply appending the words “as a service” to an existing product gave it a modern sheen that was “on trend”. Today, you can get elevators-as-a-service, wellness-as-a-service and even an NFT-as-a-service.

A few days ago, I came across a hashtag on Twitter – #trustbydesign – that gave me pause about whether something similar was underway in an area closer to home for me professionally.

For those in privacy and security, the “by design” imperative is not new. Nor is it trite.

“Privacy by design” – in which privacy considerations are baked into new initiatives at design phase, rather than remediated at the end – is a core part of modern privacy approaches. In a similar way, “secure by design” is now a familiar concept that emphasises shifting security conversations forward in the solution development journey, rather than relegating them to bug fixes or risk acceptances at the end.

But could we be entering similar territory to the as-a-service crew? For those involved broadly in the pursuit of humanising tech, on top of privacy by design and secure by design there are now proclamations of safety by design, resilience by design, ethics by design, care by design, empathy by design and the aforementioned trust by design.

Don’t get me wrong, I love a good spin-off. But as we continue to promote doing things “by design”, it’s worth keeping an eye on its usage and promotion, so it doesn’t become a hollow catchphrase at the mercy of marketing exploitation (for a parallel, see how some security peeps are now vigorously standing up to defend “zero trust”, a security approach, against assertions that it’s “just a marketing ploy”).

Doing things “by design” is important and valuable. It speaks to a crystallising of intent – a desire to do things right, and to do them up front. In fields like privacy and security, where risks have historically been raised late in the piece or as an afterthought (and sometimes ignored as a result), the emergence and adoption of “by design” approaches is a welcome and impactful change.

As “by design” catches on as a buzzword, however, it’s vital we ensure there’s substance sitting behind each of its variants. Consider the following two examples.

Privacy by design
The Privacy Impact Assessment (PIA) is a rigorous, systematic and well-established assessment process that gives structure and tangible output to the higher intent of “privacy by design”. Regulators like the OAIC endorse PIAs and publish guidance on how to do them. At elevenM, we live and breathe PIAs. Whether undertaking detailed gap analyses and writing reports (narrative, factual, checklist-based, metric-based, anchored to organisational risk frameworks, national or international), training clients on PIAs or supporting them with automated tools and templates, we’re making the assessment of privacy impacts – and therefore privacy – easier to embed in project lifecycles.

Ethics by design
The area of data ethics is a fast-emerging priority for data-driven businesses. We’ve been excited to work with clients on ways of designing and implementing ethical principles, including through the development of frameworks and toolkits that enable these principles to be operationalised into actions that organisations can take to make their data initiatives more ethical by design.

At a minimum, a similar structured framework or methodology should be articulated for any “by design” philosophy.

A final consideration for businesses is the need to synthesise these “by design” approaches as they take hold. There’s some risk that these various imperatives – privacy, security, data governance, ethics – will compete and clash as they converge at the design phase. It’ll be increasingly vital to have teams with cross-disciplinary capability or expertise who can efficiently integrate the objectives and outcomes of each area towards an overall outcome of greater trust.

We leave the closing words to Kid Cudi: “And the choices you made, it’s all by design”.

If we can help you with your “by design” approaches, reach us at hello@elevenm.com

Photo by davisuko on Unsplash

The need to look beyond cyber

elevenM Principal Pete Quigley explores whether a siloed mindset is constraining the value digital risk professionals can bring to organisations and their clients.

I was lucky in the early 2010s to be consulting into Australia’s financial services industry when AWS came to town. I saw first-hand the internal struggles between business and technology teams who wanted to adopt a cloud-first strategy and risk, privacy and security teams who felt they were giving away the keys to the castle.  

Based on my position at the time with PwC, I had a number of fireside chats with the technology risk team from APRA, Australia’s financial services regulator. APRA foreshadowed an impending situation in which institutions would become reliant on digital channels to service their customers, but would lack visibility into what individual services and vendors made up those channels.  

Fast forward a decade and most revenue-producing digital channels leverage a multitude of vendors to provide critical online services. One such widely used vendor that has been hitting the headlines recently is Akamai.

Akamai provides a number of services to optimise and protect digital channels. The nature of these services requires that you allow Akamai to manage critical functions like the Domain Name System (DNS). For those unfamiliar with DNS, it acts as the phonebook of the internet and allows users to connect to websites using domain names such as elevenM.com, instead of IP addresses.

DNS is commonly considered to be a fragile system. When there are errors in the use or updating of this phonebook, users can’t find websites. This was the case with Akamai recently, whose DNS failure led to a massive internet outage.
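For the technically inclined, here is a minimal sketch in Python (purely illustrative, and in no way drawn from Akamai’s systems) of the lookup every browser performs behind the scenes, and why a failure at this step makes a perfectly healthy website unreachable.

import socket

# DNS in action: ask the internet's "phonebook" which IP address sits behind a domain name.
try:
    ip = socket.gethostbyname("elevenm.com")
    print(f"elevenm.com resolves to {ip}")
except socket.gaierror as error:
    # If the lookup fails (as it did for sites served via Akamai during the outage),
    # the user never learns where to connect, so the site appears down even though it is healthy.
    print(f"DNS lookup failed: {error}")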

When I am asked what elevenM does, I usually revert to our tagline of ‘specialist cyber, privacy and data governance professionals’. I say that because it is what people understand and can draw a line to specific services and, indeed, specific outcomes. Within elevenM, however, we talk in terms of digital risk – the risk our clients face when operating in a digital economy.  

The outage caused by a bug in Akamai’s DNS service was not cyber, privacy or data governance related. In fact, Akamai was at pains to say the issue “was not a result of a cyberattack”, even though it had very little else to say about the root cause.

But the issue still had a significant impact on the availability of the digital channels of a large portion of the internet, and thus on the trust and confidence of users of those services – which, ultimately, is arguably what our industry is about.

So, is it time we stop talking about specific delivery-focused silos and start thinking in terms of the customer’s digital experience – assessing more holistically the risks to those digital experiences, and how effectively we are measuring and managing them?

Rotting fish: The need to improve cyber culture

elevenM’s newest recruit Jasmine Logaraj shares her thoughts on improving the culture within the cyber security industry, and how that will help to defend against cyber threats.

This week, I had the opportunity to attend The CyberShift Alliance’s discussion “Addressing workplace culture in the cyber security sector.” The CyberShift Alliance is a collaboration between several associations and organisations – including ISACA SheLeadsTech, FITT, CISO Lens, AWSN, the Australian Signals Directorate, AustCyber, ISC2, AISA, DOTM, EY and Forrester Research – with the goal of addressing culture change within security. The alliance formed from an earlier International Women’s Day event run by AWSN and ISACA.

The purpose of the discussion this week was to raise awareness of toxicity in the cyber security industry. Speaker Jinan Budge, Principal Analyst at Forrester, described the main reasons for toxicity in the industry as being lack of organisational support, ego, and low leadership maturity.

Poor workplace culture is preventing good talent from joining the industry and making it harder to retain the talent already there. It is hindering the quality of work and preventing us as a nation from tackling cyber threats in the most inclusive, collaborative and, therefore, most effective way.

I asked Jinan and the panellists during the Q&A session to elaborate on the idea of toxicity being a barrier to young talent. Panellist Jacqui Kernot, Partner in Cyber Security at EY, said it was hard to hire good talent not because of a shortage of professionals with STEM skills, but because the industry needs to become a better place to work.

As cyber security professionals, we need to make this industry a more exciting and happier place. When recruiting, employers need to consider not only whether candidates are properly skilled, but whether they are the right fit for a good workplace culture – and in turn, whether their company is worthy of such wholesome candidates. Knowledge can be taught. Personality cannot.

Another interesting point raised during the discussion was the inability to speak out about bad behaviour in the cyber security industry. Jinan surveyed her professional network and found that 65% of respondents considered it “career suicide” to speak up about workplace problems, highlighting a fear of being punished for doing so.

Changing this consensus relies on us as cyber security professionals leading the way. As Jacqui pointed out: “the fish rots from the head.” It is not an HR problem, but something to be fixed at the leadership level and not denied or swept under the rug. If companies do not address these problems, they will continue to lose good talent, and in turn waste money, time and effort, leaving them with fewer employees and a diminished reputation. Akin to our efforts to create a security-focused culture within our clients, at elevenM we believe good workplace culture similarly requires an effort to foster shared values through leadership and role-modelling.

I am grateful that there are individuals such as Jinan, Jacqui and James working in my industry who realise the importance of fostering a good workplace culture. With leaders like these, I remain hopeful for the future.

Pete was right (but just this time)

By Arjun Ramachandran

Pete’s a hardened security professional. He’s been in this game for over 20 years and has the battle scars of many cyber uplift and remediation programs. He feels the pain of CISOs fighting the good fight.

I’m a former journo. Even though I’m no longer actively reporting, the craft matters to me and I get defensive and outraged in equal measure about the quality of print news.

Pete and I butted heads in recent days over the reporting of the NSW Auditor-General’s (AG) report into Transport for NSW and Sydney Trains.

The AG report, and much of the media coverage, was overtly critical. A “litany of cyber security weaknesses [identified in] a scathing review” was how the SMH described it.

Pete wasn’t overly happy about the coverage, feeling both the media reporting and the AG report to be unfair, particularly in the context of how organisations actually go about improving their cyber posture (more on this later).

While I defended the reporting as – for the most part – being a fair and straight write-up of the AG report, I have to concede that on the bigger point Pete was dead right.

There’s a problem in our industry, and in broader society, with how we often talk about and respond to reviews of organisations’ security programs. It’s not that we’re quick to talk about deficiencies that don’t exist or that aren’t serious, but that the way these reviews are presented drives a response and conversation that is counterproductive to good security outcomes.

There are no perfect security programs, or organisations with perfect security. The threat landscape changes daily and security uplifts require complex whole-of-organisation changes that necessarily take a long time. Even in the most mature organisations, “good security” is a work in progress, with gaps continuously identified and addressed. Any detailed review will find holes and inadequacies.

To be clear, this is not an argument to take lightly the adverse findings of a review. Arguably, these findings are a large part of why we do reviews, so that a measured and constructive response to them can lead to improved security.

But too often in our work at elevenM we see instances where the supercharged or out-of-proportion delivery of adverse findings leads to an equally supercharged response (sometimes in the form of its own large remediation program) that sees a sizeable redirection of resources, and ultimately the deferral or interruption of well-considered strategies or critical uplift projects.

We found it particularly uncomfortable that the AG report was based on a red team exercise. A red team exercise – where “authorised attackers” (in the words of the AG report) are given permission to try and break into systems – will always find security flaws. These exercises are typically conducted expressly to provide insights that security teams can learn from. Publishing those findings broadly, in the tone and manner of the AG report, didn’t strike us as constructive.