Early stopping: how to avoid the AI proof-of-concept trap

Too many AI projects fail Proof of Concept testing due to flaws that could and should have been foreseen. Here’s how to screen your projects beforehand – saving time, energy and money

AI’s long-term potential is vast and exciting. It’s why many companies have no problem generating ideas for new AI projects. Yet within this lies a paradox: although organisations are rarely stuck for ideas, few projects succeed – precisely because so many ideas are pursued.

But the ‘PoC crisis’ or ‘PoC trap’ – in which resources and money are wasted on ultimately fruitless PoC testing of AI projects – can be sidestepped. What’s needed is a screening process.

Be rigorous

One way of avoiding the PoC trap is to be more rigorous in how you discover new opportunities. Embrace ideation workshops wholeheartedly, as free thinking without boundaries has definite value at this stage, but find the balance; without some framework these sessions can become directionless and wasteful.

Once you have your ideas, it’s time to be clinical. Vetting them is an important next stage – because 90% of projects don’t warrant being taken to PoC. How do we identify the precious 10% that should?

Learn from venture capitalists

Imagine a venture capitalist (VC) taking a cursory glance at 25 different pre-seed pitch decks, picking one and then hiring a law firm – for serious money – to do a deep due diligence analysis of that one company.

Of course, this simply wouldn’t happen in that world – but it’s exactly what is happening in AI, with many projects being rushed through to PoC before undergoing any real scrutiny.

Without first considering the strength of a company’s management team, its products and market, an investor wouldn’t start any deep analysis. The same rules apply to portfolio management in early-stage drug development.

Due to the hit-driven nature of these industries, both have developed rigorous processes for screening opportunities before jumping into a significant investment. It’s high time AI innovation adopted this point of view too.

Kill your darlings

For writers, ‘killing your darlings’ means identifying and erasing characters, storylines or features for the greater good of the story. It’s a process we in AI innovation could learn from. We need to pinpoint those flawed ideas and eliminate them, before they hit PoC.

First, let’s nail down what a PoC is and what we want it to deliver. Running a PoC is essentially one step of a screening process, focused on determining technical or methodological feasibility. But there are other screening checks you can run beforehand. They’re cheaper, but they’re also less exciting – and therein lies the problem. Data scientists and AI experts find PoCs alluring. They want to ask: “Could this amazing project work?” – when what they should be asking is: “How can I confirm – as cheaply as possible – that this project is worthwhile?” 

Three pre-PoC screening steps

Step 1: Estimate the total cost

Starting with costs, not benefits, may sound counter-intuitive. But we are not here to sell a project; we are here to eliminate those that are not worth pursuing, as quickly as possible. And for that purpose, costs are great.

In a technology as experimental as AI, benefits are often uncertain. So any investment needs a leap of faith. Even if these hypothetical benefits far exceed costs, in practice there is a limit on how much can be invested in pursuit of a speculative ROI.

Costs, on the other hand, are real, and can be estimated without going into too much detail; they give you a first, rather discriminating, hurdle for your project to clear before it can advance to the next stage.

Here’s a rough breakdown of how to evaluate running and one-off development costs:

One-off development costs

  • Proof of Concept: How much would it cost to do a PoC? Plan for two to six months with a team of two to four people, plus at least half a full-time equivalent (FTE) subject matter expert.

  • Full-scale development: Don’t fool yourself by thinking that the PoC will take you some way towards full-scale implementation. Unless the AI system has the shape of a microservice or Excel plug-in, think of implementation as starting a complex software development project from scratch. This typically means an investment of at least €150,000, and can easily tip into the low millions.

  • System integration, training and roll-out: More often than not, rolling out an AI system comes with a bit of change management. Replacing a human task with a computer requires behavioural shifts and new responsibilities for the people involved. Plan for at least four weeks of high-touch, high-attention work, with at least one person dedicated 50-100% of their time.

Running costs

  • System maintenance: It’s important to bear in mind here that any AI system will be highly customised to your organisation. Therefore, the cost of maintaining it cannot be split over several organisations. Plan for 10-30% of initial development costs to make sure you can maintain your systems and address ongoing software ageing.

  • Operations: An AI system will usually require ongoing operational monitoring. It can also require ongoing data curation work. If the system is critical to your organisation, you may also need to design certain operational capabilities to ensure high availability.

  • External data licences: If your system depends on data from elsewhere, now’s the time to do some sensitivity analysis. Ask: “How do the total costs change if the price of external data increases by 100%?” This should be factored into your calculation. 

The purpose of this costing exercise is to understand whether the costs are in the right order of magnitude: for example, one-off/running costs of €120k/€20k, €250k/€50k or €2,000k/€200k. This level of precision tends to be enough for the purpose of a quick screening.

When you’re doing your calculations, a major component will be estimating FTE staff costs. For back-of-an-envelope figures I’d recommend basing this on the day rates of external providers. This comes close to full costs including overheads.
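
To make this concrete, here’s a minimal back-of-envelope sketch in Python. Every figure in it – the day rate, team size, build cost, maintenance share and data licence fee – is an illustrative assumption rather than a benchmark; the point is only to land in the right order of magnitude and to make the external-data sensitivity question explicit.

```python
# Back-of-envelope cost screening. All figures are illustrative
# assumptions – swap in your own day rates, team sizes and shares.

DAY_RATE = 1_000        # assumed external day rate per person, in euros
DAYS_PER_MONTH = 20     # working days per month

def one_off_costs(poc_months=4, poc_team=3.0, sme_fte=0.5,
                  full_scale_build=400_000, rollout_weeks=4, rollout_fte=0.75):
    """PoC + full-scale development + integration and roll-out (one-off)."""
    poc = poc_months * DAYS_PER_MONTH * (poc_team + sme_fte) * DAY_RATE
    rollout = rollout_weeks * 5 * rollout_fte * DAY_RATE
    return poc + full_scale_build + rollout

def running_costs(one_off, maintenance_share=0.2,
                  operations=30_000, data_licences=20_000):
    """Annual maintenance + operations + external data licences."""
    return maintenance_share * one_off + operations + data_licences

one_off = one_off_costs()
annual = running_costs(one_off)
# Sensitivity check: what if the price of external data increases by 100%?
annual_data_x2 = running_costs(one_off, data_licences=40_000)

print(f"One-off: ~€{one_off:,.0f}; running: ~€{annual:,.0f}/year "
      f"(~€{annual_data_x2:,.0f}/year if data prices double)")
```

If even the optimistic version of this calculation looks unaffordable, you have your answer before a single line of model code is written.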

Now you have costs on the table, you need to check with your sponsors whether an investment of that size would be feasible. If you get a yes, it’s on to the next stage. (And congratulations! You’ve already cleared a hurdle that the majority of PoC use cases will face – and fall at – further down the trail.)

Step 2: Validate the process

Another point that’s often glossed over is the exact “use case of the use case”. How does the AI system fit with the other manual or automated processes that precede and follow it?

Let’s say your AI system is a predictive algorithm. A common mistake is to consider an algorithm’s ability to predict as the final measure of success. But the crucial missing step is what comes next. How does a prediction translate into action? How does an insight lead to follow-up? And to what extent can that be automated?

Take the example of a bakery. A prediction model for the number of loaves that will be sold the following day may seem to make a lot of sense. It would reduce food waste, oven time and lots of time spent kneading dough. But look closer and you discover the bread production process starts three days before. Being able to predict tomorrow’s bread purchases is now, alas, useless. And predicting bread consumption three days in advance may well be a lot harder.

So when it comes to screening your AI solution, it’s useful to ask these two questions:

  1. What happens before using the new system?

  2. What happens afterwards, based on the output of the system?

It’s only once you map out each step of the way in detail that you can estimate the real value of your solution. Be clinical and critical when mapping these processes: without clarity here, any benefit estimation is mere hand-waving. And if things remain vague, that’s not necessarily the end of the project – simply an opportunity to go back to the drawing board.
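
If it helps to make the bakery point concrete, the pre-PoC check can be as simple as comparing the forecast horizon a model could plausibly deliver with the lead time of the decision it is meant to steer. A toy sketch, with hypothetical numbers:

```python
def forecast_is_actionable(forecast_horizon_days: float,
                           process_lead_time_days: float) -> bool:
    """A prediction is only useful if it arrives before the process it steers must start."""
    return forecast_horizon_days >= process_lead_time_days

# Bakery example (hypothetical numbers): predicting tomorrow's sales cannot
# steer a production process that starts three days in advance.
print(forecast_is_actionable(1, 3))  # False – the prediction arrives too late
print(forecast_is_actionable(3, 3))  # True – but a three-day forecast may be much harder to build
```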

Are your subject matter experts or process owners still excited to have the AI system embedded in their workflows, now that they know the minutiae of what that entails? Only if they are should you proceed to the next step. And make sure to get a quote from them: it will strengthen your case!

Step 3: Find your sponsors

Even the best systems and processes won’t bear fruit if they don’t find a home within your organisation. To set up that home, you need three things. Or, more accurately, three people: 

  1. Someone to bear the initial costs

  2. Someone to bear the running costs

  3. Someone to own the maintenance and operations of the system once it’s in place.

In the ideal scenario, you are the sponsor; initial and running costs come out of a budget you own. Owing to the relative infancy of AI, however, innovation and adoption tend to be driven not by business, process or product owners in organisations, but by cross-cutting initiatives, for which securing others’ buy-in is a key issue.

To nail down that commitment, you’ll need to bring more to the table than costs, of course. But even under the assumption of a positive ROI, there will be varying degrees of excitement, and hidden complexities to be discovered. So it’s better to start the conversations – and discover the fault lines – sooner rather than later.

Point 3 – the operational maintenance of AI systems – is still new territory for the majority of tech organisations, and most AI systems are more complex to run than a normal IT system. On top of this, if your organisation doesn’t already have an operational model for AI, you should expect hiccups along the way.

I wouldn’t say you need an official commitment from your three sponsors at this stage; it would be hard to get one without concrete numbers and systems. But be sceptical throughout your conversations and trust your gut feeling. Having all three sponsors lined up makes your case a lot stronger.

Now, is your project ready for PoC?

If you’ve made it this far then you’ve already cleared many of the obstacles other projects will face months down the line. (Even if your project is now out of the running, you’ve saved money, time and energy by identifying its flaws earlier on.)

You’re almost ready for PoC. Two big steps still lie ahead: estimating the benefits of your project; and a thorough look at the data landscape. Both are more difficult – and labour-intensive – than the steps covered in this article.

But for now it’s fair to say you might be on to something.
