Fabrication Optimiser - SPO¶
Single Process Optimisation
1. Product Description¶
1.1. Solution Overview¶
The AIDEAS Fabrication Optimiser (AI-FO) is a toolkit for optimising a single manufacturing process. The tool computes correction parameters that can be passed as input to a CNC machine so that a compliant part is obtained after the first re-manufacturing step.
Single process optimisation.
Quality control.
Data-driven approach for bottleneck processes.
1.2. Prerequisites¶
This AIDEAS Fabrication Optimiser offers the following features:
Optimisation of the critical production process, avoiding costly and time-consuming rework.
Greater control over the production process by taking into account factors such as external temperature, the number of remachining steps and other features.
Processing reports saved on a NextCloud server, allowing remote access to process parameters.
Data validation and pre-processing to ensure that the input data fed to the model is in the correct format (see the sketch after this list).
A simple and intuitive GUI, so the tool can be used by non-expert users without knowledge of AI tools.
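As a minimal sketch of what such a validation and pre-processing step could look like in Python, the snippet below checks the expected columns and coerces the values to numeric form. The column names and checks are illustrative assumptions, not the actual AI-FO implementation.

```python
# Illustrative sketch of an input validation / pre-processing step.
# Column names are hypothetical placeholders for the real feature set.
import pandas as pd

EXPECTED_COLUMNS = [
    "measured_diameter", "nominal_diameter",
    "external_temperature", "remachining_steps",
]

def validate_and_preprocess(df: pd.DataFrame) -> pd.DataFrame:
    # Reject input that does not contain the expected fields.
    missing = [c for c in EXPECTED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Input data is missing required columns: {missing}")
    # Keep only the expected columns, coerce to numeric, drop unparsable rows.
    df = df[EXPECTED_COLUMNS].copy()
    df = df.apply(pd.to_numeric, errors="coerce").dropna()
    return df
```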
• Technical Specifications¶
The backend of the AI-FO is developed in Python, using Flask as the framework for the API server. The backend provides the API endpoints through which the frontend communicates, sending requests and obtaining the results.
The frontend of the solution is developed in React.
For deployment, Docker is used, as it is the most widely adopted containerisation solution. Docker also makes it easy to deploy the packaged application into the runtime environment and is widely supported by deployment tools and technologies. The results of the machining process are saved on a NextCloud server, which makes it possible to keep track of all previous processes from any account with access to them.
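To make the backend architecture concrete, the following is a minimal sketch of a Flask endpoint in the style described above. The route name, payload fields and correction logic are illustrative placeholders, not the actual AI-FO endpoints.

```python
# Minimal sketch of a Flask API endpoint in the style of the AI-FO backend.
# Route name, payload fields and correction logic are illustrative only.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/measurements", methods=["POST"])  # hypothetical endpoint
def new_measurement():
    data = request.get_json(force=True)
    # Basic validation of the incoming measurement record (illustrative fields).
    required = {"measured_diameter", "nominal_diameter", "external_temperature"}
    missing = required - data.keys()
    if missing:
        return jsonify({"error": f"missing fields: {sorted(missing)}"}), 400
    # Placeholder for the model call that computes the CNC correction parameters.
    correction = data["nominal_diameter"] - data["measured_diameter"]
    return jsonify({"correction_parameters": {"diameter_offset": correction}})

if __name__ == "__main__":
    # The documented default port for the backend server is 5002.
    app.run(host="0.0.0.0", port=5002)
```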
• Technical Development¶
This AIDEAS Solution has the following development requirements:
Development Language: Python and Javascript.
Libraries: Numpy, Pandas, Scikit-Learn, Flask, pymongo, nextcloud-api-wrapper.
Container and Orchestration: Docker, Kubernetes.
User Interface: React, PrimeReact.
Application Interfaces: RestAPI.
Database engine: MongoDB, NextCloud.
• Hardware Requirements¶
AI-FO can run on any platform that supports Docker containers.
• Software Requirements¶
Docker Desktop (Windows, Mac, or Linux)
npm (for frontend deployment)
• External Dependencies¶
NextCloud (for report storage)
2. Installation¶
2.1. Environment Preparation¶
Ensure that all dependencies, including Docker, Python, and npm, are installed. Clone the repository from the official GitLab project and configure the backend and frontend environments as needed.
2.2. Step-by-Step Installation Process¶
Local Installation: Requires configuring backend and frontend, installing dependencies, and launching services manually.
Docker Installation: Uses a docker-compose.yml file to deploy the application.
Kubernetes Installation: Pending implementation.
3. Initial Configuration¶
3.1. First Steps¶
• Login¶
Users must log in using GitLab authentication before accessing secured application features.
HOME¶
Dashboard → Tab in which an introduction to AI-FO is displayed and from which the other tabs can also be accessed.

Help → Tab with guidelines.

AI-FO¶
New Measurement → Tab where the user enters the measurement data of the machined component and can see any errors and the corrective parameters.


Storage Data → Tab where the user can add or modify NextCloud servers associated with the solution.

3.2. Main Workflows¶
• Workflow Description (Step-by-Step) and Examples¶
By going to the ‘Start New Measurement’ section, the operator can enter the measurement data of the machining operation. Once all fields have been filled in and the ‘Generate Results’ button is clicked, the correction parameters to be passed as input to the machine are displayed. It is also possible to view the graphs of the measurement performed, in order to analyse whether the part conforms or not.

The data are saved on a NextCloud server, where a complete processing report and additional information can be viewed.
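As an illustration, a processing report could be pushed to a NextCloud server through its standard WebDAV interface, as sketched below. The server URL, credentials and remote path are placeholders, not the actual AI-FO configuration.

```python
# Minimal sketch: uploading a processing report to NextCloud via its standard
# WebDAV interface. Server URL, credentials and remote path are placeholders.
import requests

NEXTCLOUD_URL = "https://nextcloud.example.com"   # placeholder server
USER, PASSWORD = "aifo-user", "app-password"      # placeholder credentials

def upload_report(local_path: str, remote_name: str) -> None:
    # Nextcloud exposes user files at /remote.php/dav/files/<user>/<path>.
    url = f"{NEXTCLOUD_URL}/remote.php/dav/files/{USER}/reports/{remote_name}"
    with open(local_path, "rb") as fh:
        resp = requests.put(url, data=fh, auth=(USER, PASSWORD))
    resp.raise_for_status()

upload_report("processing_report.pdf", "processing_report.pdf")
```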
4. General Queries¶
4.1. Installation and Configuration Contact (If Service Provided)¶
For installation and configuration support, users should refer to the official GitLab project or the associated organization:
UNIVPM: Mateo Del Gallo (m.delgallo@pm.univpm.it), Filippo Emanuele Ciarapica (f.e.ciarapica@staff.univpm.it), Giovanni Mazzuto (g.mazzuto@staff.univpm.it)
UPV: Miguel Angel Mateo Casali (mmateo@cigip.upv.es)
ITI: Diego Silveira Madrid (dsilveira@iti.es)
4.2. Licensing and Support¶
Users can contact our support team at the emails listed in section 4.1 for assistance.
Pricing and licensing details are available upon request.
| Subject | Value |
|---|---|
| Payment Model | Quotation upon request |
| Price | Quotation upon request |
5. Appendices¶
5.1. Glossary of Terms¶
AIDEAS: AI Driven Industrial Equipment Product Life Cycle Boosting Agility, Sustainability and Resilience
AI: Artificial Intelligence
AI-FO: AIDEAS Fabrication Optimiser
5.2. API Documentation (if applicable)¶
By default, the backend server listens on port 5002 and exposes the following API methods. These methods are accessible through the application frontend, by sending the proper request with tools such as Postman, or directly from Python code.
| Resource | GET | POST | DELETE |
|---|---|---|---|
| | Supported | | |
| | Supported | | |
| | Supported | | |
| | Supported | | |
| | Supported | | |
| | Supported | | |
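As noted above, the endpoints can also be called directly from Python code. The sketch below shows such a call with the requests library; the endpoint path and payload fields are hypothetical examples, since the actual resource names depend on the deployed version.

```python
# Illustrative direct call to the backend API using the requests library.
# Endpoint path and payload fields are hypothetical examples.
import requests

BASE_URL = "http://localhost:5002"  # documented default backend port

payload = {
    "measured_diameter": 49.96,
    "nominal_diameter": 50.00,
    "external_temperature": 22.5,
}

response = requests.post(f"{BASE_URL}/api/measurements", json=payload)
response.raise_for_status()
print(response.json())  # correction parameters returned by the backend
```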
5.3. Console Commands List (if applicable)¶
npm install for frontend dependencies.
pip install -r requirements.txt for backend dependencies.
docker-compose up --build for Docker-based deployment.
python Endpoint_API.py to launch the backend server.
npm run dev to start the frontend server.