Artificial Intelligence and Machine Learning are appearing everywhere across the engineering sector. They are the shiny new techniques that everyone wants to be using. However, adopting any new technology is disruptive to a business, and it can be challenging even to know where to start. This article discusses how to progress from first discovery and trial to deploying a solution across an entire company. We’ve split this into three distinct stages:
- Crawl – prove the concept on one problem.
- Walk – expand the number of applications and accelerate whole workflows.
- Run – collaborate with customers and colleagues, and become embedded within the company.
In each section, we'll discuss the technical and commercial elements to consider at that stage. By the end of this six-part series on adopting AI, you will have all our guidance to help you succeed in getting up and running with AI in your business.
Starting to crawl
As cliché as it sounds, the adoption of AI is a journey. It's a risk that requires an adventurous spirit and the mindset to take on the challenge (see our article ‘Is your engineering organisation ready for AI?’). The key is that you need an achievable plan! Before you begin, line up the resources and team members, and prepare for the effort you will have to put in. The best suggestion is to start small – both figuratively and literally! The purpose of the ‘Crawl’ phase is to prove the concept.
Define a problem to start with
It’s a common misconception that AI can solve all our engineering problems. The risk is that, in a rush to keep up with advancing technology, you apply AI to a problem it cannot help you solve, which leads to disappointment and frustration in the long run. The key to success is having realistic expectations. At Monolith, our team works with you to define these expectations and set you up for success.
To start, identify the issues causing you and your business hassle. Then determine which of these can be solved with AI. Within this article, we use ‘AI’ to mean finding patterns or creating models based on data. This ability is helpful for engineering problems that are either highly complex or very repetitive. Both types generate the core ingredient of a successful AI project: data for algorithms to learn from.
Next, look more closely at the available data to determine how feasible the problem is to set up:
- What data is available for that use case?
- How easy is it to create more?
- How is the data structured and stored?
- How frequently is the data produced?
- What simulation, test, or manufacturing methods generate it?
- What does the information look like at each stage of the process?
Collecting, cleaning, sorting, and structuring data to apply AI is a daunting task, but you can limit the amount required by narrowing the scope during this first stage.
Reduce your scope
Once you’ve selected your challenge, immediately reduce the scope. I can almost guarantee you’ll have gone too big too soon – everyone gets excited and does! The initial objective of an AI project is to gather enough evidence to prove that investing more time and resources will be worthwhile. You don't want it to be an open-ended exploration. Set a challenge that you can complete in 1 to 3 months, and know how you will measure whether you’ve been successful. Simplify your problem by limiting the number of products or test conditions, e.g., testing only one type of yoghurt pot but across a range of different drop tests. The dataset describing this should be small to medium in size, with 100s to 1,000s of data points. Pick around five input variables that you want to investigate, and limit yourself to 3 to 5 output variables. Capping the number of variables keeps the dimensionality of the problem realistic for the number of available data points. To find out more about what is achievable with different quantities of data, see our article ‘How much data do I need to use AI?’.
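As an illustration only (this is not a Monolith feature), the idea of matching dataset size to dimensionality can be sketched as a quick scoping check. The threshold of roughly ten data points per input variable is our own assumption, not a figure from this article:

```python
# Illustrative scoping check only – the 10-points-per-input rule of thumb
# is an assumption for this sketch, not an official guideline.

def scope_is_realistic(n_points, n_inputs, points_per_input=10):
    """Return True if the dataset size plausibly supports the dimensionality."""
    return n_points >= points_per_input * n_inputs

# A well-scoped 'Crawl' project: one yoghurt pot design, 5 input variables
# (e.g. drop height, angle, temperature, fill level, wall thickness),
# and a few hundred recorded drop-test results.
print(scope_is_realistic(n_points=300, n_inputs=5))   # True
print(scope_is_realistic(n_points=300, n_inputs=40))  # False – too many inputs
```

If the check fails, either gather more data or cut input variables until the problem fits the evidence you have.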
It’s a team game
Who is accompanying you on this adventure? Adopting AI can't be a solo mission, so identify your team. Monolith AI is designed to give engineers the cutting-edge tools that data scientists are developing. Who will benefit most from you solving this problem? Who best understands the dataset you wish to use? Alternatively, do you already have knowledgeable data scientists but no way of disseminating their models to other functions? Finally, your team needs a champion: this is going to take time and effort, and you need an advocate within your organisation who is engaged and can facilitate and unlock access to the required data.
Testing the capability
In this phase, most organisations are looking to test the capability; you are trialling how AI can help you solve engineering problems within your business. Can you understand something new about your data? Can you prove relationships between variables that confirm your team's existing knowledge? Is it possible to predict an expected result ahead of time? Could you optimise a design using this model, or is there not yet enough information included?
Learning a new skill
Part of this first stage is also about understanding the strengths and weaknesses of machine learning, so take the opportunity to get familiar with the different algorithms available and when best to use them. Within Monolith, a range of models is available, and we recommend spending some time testing the different types of machine learning model. See what influence changing the model type or model parameters has on performance. You can evaluate models for best fit or compare the errors between the model's predictions and the recorded values. Engineers are naturally curious: they need to be convinced that these models are doing the correct thing, and to understand how they work.
Working out where to go next
To progress from the 'Crawl' stage, you should have proved the concept of using AI to solve an engineering challenge you and your team are facing. You will have been able to develop in-house data science skills, understand the impact of a variable on performance, speed up the simulation lifecycle or test campaign through prediction, or optimise a new product for release. At this point, you may be able to visualise the value, but you will likely not have realised that return just yet.