Balancing Act: Why Tech Will Always Need the Human Touch

From the role of technology in the rapid development of Covid vaccines to the dramatic growth in climate tech, innovation is regularly held up as the answer to many of our most significant challenges. Yet we are also quick to blame technology when it doesn’t deliver.

We can’t have it both ways. The reality is that human intervention, understanding and judgement are responsible for both the successes and the apparent failings of technology. The key to unlocking technology’s full potential is striking the right balance.

If we design, develop and deploy technology without the right combination of human perspectives and understanding woven in, we risk building something with poor user experience and inappropriate functionality—as well as potentially jeopardizing cyber security and resilience. And when we place too much blind faith in technology, we abdicate responsibility and risk leaving our skills and judgement to atrophy and ethical issues to arise.

“Pushing the boundaries of what technology can deliver is essential to our progress. But staying in control is critical,” says Chris Oxborough, technology risk leader at PwC. “Technology is only as good as the people who develop it and monitor its outcomes. But it is also flawed, because people are imperfect.”

If organizations don’t factor human imperfection into their strategy for getting the most out of technology, when mistakes inevitably happen, the consequences last longer, and hit harder, than they need to. And any investment is quickly undermined.

We see this starkly in cyber security, where criminals prey on people’s susceptibility to deception, using everything from unsophisticated phishing emails to high-tech heists, such as the Hong Kong bank manager who was tricked into authorising a £26m transfer by a deepfake “client” phone call. Such scams will only get more advanced, and their impact more troubling, if organizations allow the gap between technology capability and human understanding to widen.

In another example, the Colonial Pipeline ransomware attack, which shut down the supplier of half the US East Coast’s fuel for five days in May 2021, was carried out using a single password, entered into an old VPN that most of the company did not know was still in use. Leaving that VPN active was an oversight.

PwC research shows organizations are increasingly concerned about ransomware, with 61 percent expecting such attacks to increase. However, the same research reveals progress on improving employee engagement with security issues and addressing human fallibility—80 percent of organizations report more top-down engagement from CEOs on cyber security, and 77 percent report a reduction in the rate of employees clicking on phishing tests.

As technology becomes more powerful, the need for vigilance and understanding becomes more pressing. Greater capability brings greater responsibility to understand what the technology can do and the risks that come with it. That means guarding against skills atrophy.

It is an issue that surrounds us. Some of those who spent months communicating virtually during lockdowns report that in-person interaction now feels more difficult. Most of us are also familiar with drivers who faithfully follow a GPS device into situations their own instincts and judgement would have avoided. Taking that a few steps further, in aviation, concerns persist about the impact of autopilot technology on the flying skills and instincts of pilots.

These examples are indicative of a trend spanning almost every industry, where decision-making is complemented, and sometimes supplanted, by technology. That dynamic doesn’t need to create risk, as long as we are aware of its potential to do so.

“Technology can improve our decision-making and can give us incredible insights. But we must not let its power become the reason our own skills and judgement begin to dim,” says PwC’s Oxborough. “The more powerful technology becomes, the sharper and smarter we must become to head off risk and get the most out of it.”

Nowhere is the need to stay in control of the risk and reward of technology greater than with the increased application of artificial intelligence. Without proper guidance, even the most advanced artificial intelligence will struggle to produce the right result. As tech author and Google X alum Mo Gawdat noted in his 2021 book Scary Smart: “If you don’t know [what you want], then the machines won’t know what you want either.”

Lola Evans, a technology director at PwC UK, says: “Technology can be a great ally. But the design and development of any technology must be guided by a clear vision and understanding of what it is we’re looking to achieve, how we’ll do it, and what success will look like. If that doesn’t happen we run the risk of bias in our algorithms, poor user experiences, misplaced trust in our outputs and a snowball effect when we design new technologies based on earlier iterations that were flawed.”

“If we’re not confident in what we’re asking of the technology, we’re going to be really bad judges of the outcomes,” adds Evans.

As organizations look to artificial intelligence to sift CVs or process loan and credit card applications, the risk of unintended consequences arises in ways that could be discriminatory and damaging to an organization’s reputation. UK financial regulators have already warned banks that they must ensure automating loan approvals does not perpetuate bias. To move at speed, organizations should anticipate such objections and address them proactively, with a human-led approach.

“We have to strike a balance,” says Evans. “Artificial intelligence can provide the speed and scale we need in a great many processes, but we cannot abdicate responsibility. We need people seeking out potential issues and owning the accuracy and implications of the outputs. And those people must bring a diversity of experience and perspective, or we risk replicating the homogeneity that resulted in past bias, just at an industrial scale.”

There are myriad risks related to technology, but they only become real problems when organizations lack the ability to identify them and respond effectively.

“It’s about mitigating risk sufficiently, but not to the point where it puts the brakes on innovation,” says PwC’s Oxborough. “Quite the opposite. It’s about knowing the brakes are there, that you understand how they work and how they can be applied, so you can move at speed, with confidence, as you innovate.”

How do we maximise the benefits of technology while mitigating the risks? Join us on May 19 for a virtual panel discussion, hosted by WIRED, to hear from The LEGO Group’s CDO, the CEO of Rolls-Royce’s R² Factory, and PwC’s technology risk leader on how to strike the right balance.

Source: wired.co.uk

