First published on the CBI Website.

In our quest for ultimate efficiency, have we lost sight of quality? This is a question I’ve been wrestling with since considering the level of automation that should be delivered by AnyGood? – a platform where professionals recommend other professionals for roles.

Every day I see new products launching with increasingly complex technology – whether it be platforms that automatically match candidates to roles, robots interviewing candidates, or AI predicting how someone will perform in five years’ time.

We could quite easily layer on more and more technology, delivering an end-to-end process with just a human at the beginning providing input and one at the end receiving an output. But I’m increasingly questioning whether this is the best approach. There’s something about pure technology solutions that doesn’t sit right with me when the process itself is so human. It’s about what we lose when introducing this level of automation, rather than what we gain.

In research we commissioned last year into trust in recruitment, we found that over 90% of people wanted the job search industry to stop relying on computer algorithms and rely more on real human beings – candidates and hiring managers alike. It became very clear that trust is key to the mass adoption of any high-tech solution, and that platforms need to prove they are worthy of that trust.

One way we’ve found to get the balance right, as highlighted by Rachel Botsman’s work on trust, is to include moments of friction – something that slows down the process and recognises the value of being able to think about our decisions. This has the knock-on effect of turning what can seem like a black-box solution into one where the key steps are more transparent and you feel a greater level of control.

The interesting thing about these friction points is that they can also provide a business with new opportunities: a moment to engage, to communicate, and perhaps to build a deeper relationship. In our case, they allow us to give a member of our network the opportunity to reflect on why they might recommend someone for a role, ensuring it’s for the right reasons. To us, this golden nugget of value cannot be provided by a technology solution alone.

We’ve learnt from the experience of hyperlocal social network Nextdoor, which – as members discuss how to make their neighbourhoods better – was accused in an article by news site Fusion of becoming a home for racial profiling. It has since added steps that encourage members to stop and think before posting. These steps, together with prompts such as “Ask yourself – is what I saw actually suspicious, especially if I take race or ethnicity out of the equation?”, have reduced posts containing racial profiling by 75%.

Ultimately, the key is finding a balance: handling each step or moment of the journey in the way that works best, while considering whether the end-to-end journey allows people to trust the decisions they are making within it. In my experience, businesses have tended to assume that “optimum” means reducing human intervention. Recognising that this may compromise the quality of the output, reduce adoption through a lack of trust, or even allow bias to flourish is, I suspect, a significant shift in thinking.