Components

Shop Floor Cognitive Accelerator

The Shop Floor Cognitive Accelerator is a reconfigurable hardware platform built around a programmable System on Chip (SoC) that combines an embedded multi-core processor with an FPGA fabric to exploit both software- and hardware-based processing in an edge computing context. By moving computation close to the data sources, the platform reduces latency (communications with the cloud infrastructure can be optimized) and increases privacy.
The Shop Floor Cognitive Accelerator Hardware allows developers to deploy their algorithms as dedicated application-specific accelerators on the FPGA fabric. Two complementary technologies ease hardware-related development by abstracting the low-level technology dependencies that are usually a barrier for software-oriented developers: Dynamic and Partial Reconfiguration (DPR), which enables flexible hardware implementations in which the internal FPGA resources are time-multiplexed, and High-Level Synthesis (HLS), which enables hardware accelerators to be specified in high-level programming languages such as C or C++.
In addition, the Shop Floor Cognitive Accelerator Hardware features standard interfaces to establish communication channels with both the machines and/or sensors available on the Shop Floor and the cloud infrastructure, enabling an edge-to-cloud continuum.
Finally, the Shop Floor Cognitive Accelerator Hardware will include internal mechanisms to enable automated run-time self-monitoring and self-management. These mechanisms will be based on a dedicated on-board measurement infrastructure and embeddable Machine Learning (ML) models to estimate both performance and power consumption, and to make online decisions to move the platform to the optimal operating point.

Data Reduction Techniques & Fault Dependency Model

This component covers research into computational intelligence methodologies to develop a forecasting method able to deal with heterogeneous and possibly sparse information, accommodating both regularly acquired values (e.g. time series) and discrete events.
In particular, the component takes advantage of convolutional neural networks and time regressors to predict the progression of incipient faults. In parallel, the team investigates case-based reasoning approaches to estimate future outcomes; the adopted strategy assumes that the known past evolution of a set of similar conditions can be combined to estimate the evolution of the current condition.
In addition, the models developed are flexible in the sense that they can provide estimates even when the available information is incomplete or somewhat sparse. Moreover, they can deal with different types of data, namely time series and discrete events, as found in the KYKLOS use cases.
This technical component is mainly devoted to effective problem-dependent reduction methods for high-dimensional data, allowing an intuitive and reliable interpretation of readings and reducing data complexity, as well as to an intelligent framework for Prognostics and Health Management (PHM) at component level, able to follow up the overall health status by relying on time-dependent condition-based features or indicators.
It includes functionalities for data pre-processing, focusing on outlier detection and dimensionality reduction; for fault diagnosis, estimating the overall system health status (degradation level) using a set of relevant condition-based features or indicators; and for prognostics, predicting the evolution of the health status indicator.
Machine learning and other computational methodologies will be used for outlier detection and accommodation, data dimension reduction and data-driven health condition progression.
This component provides a time-dependent predictive model that will be directly used within Task 3.4, in the implementation of the inference inspection and control engine for the autonomous online inspection, condition monitoring and control at the AM production plant.
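
As an illustration only, the following minimal sketch shows how such a pre-processing stage could be composed from generic building blocks (here scikit-learn's IsolationForest and PCA); the project's actual methods are problem-dependent, and the data layout is assumed to be a plain numerical matrix of sensor readings:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler

def preprocess_readings(X: np.ndarray, n_components: int = 2):
    """Remove outlier samples, then project onto a low-dimensional space."""
    X_std = StandardScaler().fit_transform(X)

    # Outlier detection: IsolationForest flags anomalous readings as -1.
    inliers = IsolationForest(random_state=0).fit_predict(X_std) == 1
    X_clean = X_std[inliers]

    # Dimensionality reduction: keep the directions of largest variance.
    pca = PCA(n_components=n_components)
    X_reduced = pca.fit_transform(X_clean)
    return X_reduced, inliers, pca.explained_variance_ratio_

# Example: 200 ten-dimensional synthetic readings with injected outliers.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 10))
X[:5] += 8.0  # injected outliers
X_red, mask, var_ratio = preprocess_readings(X)
print(X_red.shape, int(mask.sum()), var_ratio)
```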

Features:

  • Data pre-processing, focusing on outlier detection and dimensionality reduction.
  • Fault diagnosis for estimating the overall system’s health status.
  • Time-dependent predictive model.

Semantic Knowledge Base (SKB)

The SKB constitutes a sophisticated knowledge base that deploys semantic technologies for the representation and fusion of domain knowledge (content and context) under a common reference schema called ontology.
An ontology models notions such as the systems, actors and processes involved in the KYKLOS 4.0 related domains. The KYKLOS 4.0 ontology will be the basis for data integration and for the semantic fusion of information deriving from diverse data sources into an actionable knowledge graph. For the definition of the ontology schema and the semantic reasoning rules, knowledge is acquired from domain experts, using methodologies such as questionnaires and ad-hoc interviews. The NeOn methodology has been adopted, which proposes the definition of Competency Questions (CQs) to act as a requirement elicitation method for the ontology.
The SKB component also provides the required infrastructure by deploying an RDF triplestore. RDF triplestores are graph databases that allow ontology schemas to be loaded and populated via (but not limited to) the SPARQL query language. Moreover, the SKB allows the application of semantic reasoning rules in the form of SPARQL queries to uncover underlying knowledge, infer new interconnections between concepts and retrieve knowledge with complex queries. The free editions of popular RDF triplestores (e.g. GraphDB, Stardog) have been evaluated, and the most appropriate was selected and deployed. The interaction with the SKB is performed using the Semantic Knowledge Base Service (SKBS).
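
For illustration, a query against such a triplestore could look like the following Python sketch; the GraphDB-style endpoint URL and the ontology terms ky:Machine and ky:hasSensor are invented placeholders, not the actual KYKLOS 4.0 ontology:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint and ontology IRIs, for illustration only.
ENDPOINT = "http://localhost:7200/repositories/kyklos"
PREFIX = "PREFIX ky: <http://example.org/kyklos#>"

def machines_with_sensors():
    """Retrieve machines and their attached sensors from the knowledge graph."""
    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setReturnFormat(JSON)
    sparql.setQuery(PREFIX + """
        SELECT ?machine ?sensor WHERE {
            ?machine a ky:Machine ;
                     ky:hasSensor ?sensor .
        }""")
    results = sparql.query().convert()
    return [(b["machine"]["value"], b["sensor"]["value"])
            for b in results["results"]["bindings"]]
```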

Features:

  • Sophisticated knowledge base that deploys semantic technologies.
  • Representation and fusion of domain knowledge (content and context) under a common reference schema.

Semantic Knowledge Base Service (SKBS)

The Semantic Knowledge Base Service is developed as a RESTful API that serves requests for knowledge insertion and retrieval to/from the Semantic Knowledge Base via HTTP requests from other system modules.
Interaction with the Semantic Knowledge Base (SKB) requires a) a good grasp of the concepts and relationships defined in the KYKLOS 4.0 ontology, and b) a basic understanding of and experience with semantic technologies like SPARQL. Integrating the SKB directly with other components in the KYKLOS 4.0 system would therefore be challenging, requiring other technical partners to familiarize themselves with these concepts and stay constantly up to date with the latest version of the ontology.
To this end, we developed the SKBS as a RESTful API that serves requests for knowledge insertion and retrieval via HTTP requests from other system modules. Each exposed service receives certain parameters and constructs the appropriate SPARQL queries. Acting as an abstraction layer, the SKBS also serves as a security mechanism that limits external access to the SKB, which could otherwise lead to invalid schema structures and non-conforming data. The SKBS also manages the on-demand application of semantic reasoning in the form of SPARQL-expressed rulesets towards the inference of new knowledge. This is a modular mechanism, enabling the definition of different rules according to the use case scenario at hand.
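
A minimal sketch of what one such exposed service could look like is given below, using Flask; the route, the ontology terms and the endpoint URL are hypothetical and serve only to show how HTTP parameters are turned into a SPARQL query behind the abstraction layer:

```python
from flask import Flask, jsonify
from SPARQLWrapper import SPARQLWrapper, JSON

app = Flask(__name__)
SKB_ENDPOINT = "http://localhost:7200/repositories/kyklos"  # assumed

@app.route("/machines/<machine_id>/sensors", methods=["GET"])
def get_machine_sensors(machine_id):
    """Build the SPARQL query on behalf of the caller, hiding the ontology."""
    # NB: a real service would validate machine_id before query construction.
    sparql = SPARQLWrapper(SKB_ENDPOINT)
    sparql.setReturnFormat(JSON)
    sparql.setQuery(f"""
        PREFIX ky: <http://example.org/kyklos#>
        SELECT ?sensor WHERE {{
            ky:{machine_id} ky:hasSensor ?sensor .
        }}""")
    bindings = sparql.query().convert()["results"]["bindings"]
    return jsonify([b["sensor"]["value"] for b in bindings])

if __name__ == "__main__":
    app.run(port=8080)
```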

Features:

  • Request serving for knowledge insertion and retrieval via HTTP requests from other system modules.
  • Manage the on-demand application of semantic reasoning in the form of SPARQL-expressed rulesets towards the inference of new knowledge.

Inference Engine

This technical component includes the inference inspection and control engine for the autonomous on-line inspection, condition monitoring and control at the AM production plant. Its operation is based on advanced diagnosis and prognostic tools that are needed for rapid detection, identification, and isolation of faults in Cyber-Physical Systems (CPSs), such as those of the KYKLOS 4.0 project, so that proactive and predictive maintenance actions are taken to avoid failures and improve asset availability, as well as diagnose issues with the overall manufacturing line.
Its main goal is to calculate deterioration rates, i.e., Health Indicators (HI) or the Remaining Useful Life (RUL), using data-based intelligent approaches on current and previous condition data. This information can also be used to predict failures in subsequent operation periods (i.e., the production of defective products) and to decide on possible alarms and the activation of predictive maintenance strategies.
The component uses reduced-dimension data, both off-line historical data and on-line (real-time) data collected from the sensor network. Based on models for the prediction of failure and degradation, as well as on measurements, an optimal diagnostic strategy is defined (machine, artifact and process diagnostics).
This optimal diagnostic strategy is based on machine learning techniques (such as k-Nearest Neighbours, Support Vector Machines (SVM) and Artificial Neural Networks) combined with feature extraction, which is considered by the inference engine.
The main benefits of the component are a slim representation of the sensor inputs that can be used for interpretability, and support for predictive maintenance through health condition indicators and RUL estimation.
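
The following sketch illustrates the general pattern of training such classifiers on reduced-dimension condition features; the data is synthetic and the three health classes are invented for the example:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Illustrative data: reduced-dimension condition features -> health class
# (0 = healthy, 1 = degraded, 2 = faulty). Real labels come from history.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4)) + np.repeat(np.arange(3), 100)[:, None]
y = np.repeat(np.arange(3), 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```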

Features:

  • Calculate deterioration rates.
  • Predict failure in subsequent operation periods.
  • Decide on possible alarms and activation of predictive maintenance strategies.

Production Equipment Clustering (PEC)

The component extracts knowledge about the equipment, clusters machines that may be located at one or several sites, identifies groups of machines demonstrating similar behaviour, and utilizes performance metrics defined by the end-users and domain experts.
The component is able to extract “useful knowledge” about the machine fleets, rather than just “new information”.
It uses a novel clustering methodology based on feature engineering techniques for all types of datasets (numerical, categorical, timeseries, mixed). Unsupervised machine learning algorithms are used, such as Self-Organizing Maps (SOM), Gaussian Mixture Models (GMM), K-means, K-modes, K-prototypes, hierarchical clustering, and more.
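
As a minimal illustration of the numerical case, the sketch below clusters a synthetic machine fleet with K-means and selects the number of clusters by silhouette score; the categorical, timeseries and mixed cases (K-modes, K-prototypes, SOM, etc.) require other libraries and are not shown:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Illustrative fleet data: one row per machine, numerical performance
# metrics defined by end-users (e.g. throughput, energy use, error rate).
rng = np.random.default_rng(1)
fleet = np.vstack([rng.normal(loc=m, size=(30, 3)) for m in (0.0, 3.0, 6.0)])
fleet = StandardScaler().fit_transform(fleet)

# Pick the number of clusters with the best silhouette score.
best = max(range(2, 7), key=lambda k: silhouette_score(
    fleet, KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(fleet)))
labels = KMeans(n_clusters=best, n_init=10, random_state=0).fit_predict(fleet)
print("clusters:", best, "labels:", labels[:10])
```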

Features:

  • Extracts knowledge about the equipment.
  • Clusters machines that may be located at one or several sites.
  • Identifies groups of machines demonstrating similar behaviour.
  • Utilizes performance metrics defined by the end-users and domain experts.
  • Extracts “useful knowledge” about the machine fleets.

Maintenance Scheduler

The Maintenance Scheduler component supports the specialists in accomplishing their mission of maintaining the fixed means of production (machines, installations, equipment) in working condition, at their nominal parameters and under safe conditions, throughout their service life, with minimal costs and without affecting production.
Maintenance Scheduler aims to schedule the workloads on the equipment, without affecting the production time. For this purpose, a genetic algorithm for automatic task planning for the factory equipment is used.
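
The toy sketch below shows the shape of such a genetic algorithm: candidate plans assign tasks to time slots, fitness penalizes busy production slots and clashing tasks, and the population evolves by crossover and mutation. The slot loads, penalties and operators are invented for illustration; the real scheduler's constraints are richer:

```python
import random

# Toy model: assign each maintenance task to one of SLOTS time slots.
SLOTS = 8
LOAD = [0.9, 0.8, 0.2, 0.1, 0.7, 0.3, 0.1, 0.6]   # production load per slot
N_TASKS = 5

def fitness(plan):
    clash = len(plan) - len(set(plan))            # tasks stacked in one slot
    return -(sum(LOAD[s] for s in plan) + 2 * clash)

def evolve(pop_size=60, generations=200, mut_rate=0.2):
    pop = [[random.randrange(SLOTS) for _ in range(N_TASKS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # elitist selection
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_TASKS)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut_rate:        # mutation: move one task
                child[random.randrange(N_TASKS)] = random.randrange(SLOTS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print("best plan (task -> slot):", evolve())
```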

Features:

  • Preventive Maintenance: Automatic triggering of the equipment maintenance schedule.
  • Preventive Maintenance: Work instructions for maintenance activities as drawings and in digital format.
  • Preventive Maintenance: Database of issues for each piece of equipment when maintenance is performed (by scanning the equipment).
  • Preventive Maintenance: Equipment status, showing the issue on the equipment.
  • Preventive Maintenance: Automatic spare parts triggering and delivery based on planning.
  • Corrective Maintenance: Automatic, user-based ticketing – know who is taking the issue and how much time is recorded to solve it, and document the action.
  • Corrective Maintenance: Automatic issue triggering from equipment and monitoring in the Continental MES system.
  • Corrective Maintenance: Automatic dashboards and escalation process in case of breakdown.
  • Predictive Maintenance: Monitoring of the line and possible defects – subcomponent wear.
  • Predictive Maintenance: Alarms when a subcomponent's lifetime is coming to an end (cylinders, pins, etc.).
  • Predictive Maintenance: Prediction, based on history, of which components have a high risk of damage (replaced too many times, critical for the equipment, etc.).
  • Predictive Maintenance: Overview of spare parts costs (optimization) and stock adjustment for spare parts.

Deep Learning Toolkit

The KYKLOS 4.0 Cognitive ToolKit (KCTK) is a framework offering a set of modules and functionalities that enable advanced data analytics for Industry 4.0 and lead to more informed decisions.
Although this toolkit is developed to address the needs of the KYKLOS 4.0 ecosystem, many parts of it are generic and can be used in a broader context. In the context of the pilots, the KCTK will be used in the wheelchair production to support design and customisation as well as in predictive maintenance.
Thanks to the digitalisation of manufacturing processes and the use of sensors, a huge amount of data is generated. Analysing this data and providing insights helps optimize the processes and predict failures of the machines on the shop floor. Since the amount of generated data is far too large for humans to deal with, support from machine learning algorithms is needed.
The Cognitive ToolKit addresses this issue by offering a set of functionalities enabling advanced data analytics for the circular economy and leading to more informed decisions. In the context of KYKLOS 4.0, the Cognitive ToolKit is based on a variety of algorithms, such as Deep Learning (DL) and Principal Component Analysis (PCA), and will be used in wheelchair production to support design and customisation as well as predictive maintenance. To this end, the development of this toolkit will go through the following steps:

  1. Understand the pilot scenarios in more detail, with the support of the pilot hosting partner.
  2. Pre-analyse the data that will be available.
  3. Determine the appropriate ML/DL algorithms.
  4. Collect data and start the training/testing/refining.
  5. Integrate the toolkit with the rest of the KYKLOS 4.0 components.
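
As a small illustration of step 4, the sketch below trains a generic neural-network regressor on synthetic sensor features to estimate hours until the next maintenance; the actual KCTK models and data are pilot-specific:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Illustrative task: learn hours-until-maintenance from sensor features.
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 6))                       # synthetic sensor features
hours = 100 - 20 * X[:, 0] + 5 * X[:, 1] + rng.normal(scale=3, size=500)

X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, hours, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```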

Features:

  • Offering recommendations for optimal end-products in the design phase.
  • Predicting when the next machine maintenance should be performed.

Augmented Reality-based Re-configurator Tool

The Augmented Reality-based Re-configurator Tool provides (near) real-time information (coming from sensors) about shopfloor machines in Augmented Reality when the operator is in front of the machine.
The AR-based re-configurator tool provides an effective and easy way to create AR panels and visualize that information based on different profiles. It consists of a web-based authoring editor to configure AR panels, data provided by IoT/sensors, simulations and alarms, and a visualization interface for mobile devices and HoloLens. The two parts integrate seamlessly to create an integrated solution applicable to industrial sectors.

A typical user scenario could be as follows:

  • The shopfloor manager adds a new machine to the system via the web backend, configuring the panel and the controls (with their variables) where the information will be shown. It is possible to define calculated variables that do not come from IoT/sensors but perform calculations using formulas. It is also possible to configure alarms to be shown in AR based on thresholds.
  • The shopfloor manager, once the machine has been added, defines through the web backend what type of trigger will activate the augmented reality visualization.
  • Then the shopfloor manager publishes that information from the backend so that it can be consumed by mobile devices, defining the types of users and roles that will have permission for it.
  • From that moment, any worker on the shopfloor with permissions will be able to trigger, with their device, the visualization of the sensor information in Augmented Reality.

The main benefits obtained when using this component on the shop floor relate to the productivity and efficiency of the operator's work, allowing the worker to see, in one place, information from different sensors and know the status of a machine while interacting with the real world. By changing specific data of the variables containing the sensors' data, the worker can see the impact on other variables and take decisions, resulting in production cost reduction, reused materials or energy savings that promote sustainability and the circular economy.

Automated Task Planner Toolkit

The Automated Task Planner (ATP) takes a product list of requirements and generates the best production line able to create the designated product.
An automated production plan allows for faster response to market requests, improving ROI on flexible production lines, increasing market availability of scarce products and improving reliability and resilience of production.
The ATP takes into account all the available Primitive Functional Systems, the smallest production units that can be configured, and all the connection possibilities between them. The best configuration that achieves the stated product requirements will be the result of the optimized planning.
Working closely with the Virtual Production Line Orchestrator, the Automated Task Planner can generate production lines comprising Primitive Functional Systems located at different sites, allowing for decentralized operations and adding resilience to local constraints (e.g. floods).
The use of ever more flexible Primitive Functional Systems based on AM technologies leads to optimized production lines that can be reconfigured in a larger set of scenarios. The automation of production line configuration based on product requirements allows for smaller production batches, tailored to each customer's specific needs, resulting in less surplus and less wasted resources.
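
A toy sketch of the selection problem is shown below: given invented Primitive Functional Systems, each offering a set of operations at a cost, a greedy set-cover heuristic picks a line covering the product requirements. The real ATP optimization is richer (connections between PFS, multi-site constraints) and is not reproduced here:

```python
# Invented PFS catalogue: name -> (operations offered, cost).
PFS = {
    "printer_A":  ({"print"},          5.0),
    "printer_B":  ({"print", "cure"},  8.0),
    "cnc_mill":   ({"mill"},           6.0),
    "finish_hub": ({"cure", "polish"}, 4.0),
}

def plan_line(required: set[str]):
    """Greedy set cover: repeatedly take the PFS with the best ops/cost ratio.
    Assumes the requirements are coverable by the catalogue."""
    chosen, missing = [], set(required)
    while missing:
        name, (ops, cost) = max(
            ((n, v) for n, v in PFS.items() if v[0] & missing),
            key=lambda item: len(item[1][0] & missing) / item[1][1])
        chosen.append(name)
        missing -= ops
    return chosen

print(plan_line({"print", "mill", "polish"}))
```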

Features:

  • The Automated Task Planner (ATP) takes a product list of requirements and generates the best production line able to create the designated product.
  • It can generate production lines comprised of Primitive Functional Systems located in different sites, allowing for decentralized operations and adding resilience to local constraints.

Rapid Prototyping Module

The Rapid Prototyping Module functions as an AM support toolkit accessible through the Web.
The Rapid Prototyping Module employs state-of-the-art decision-making algorithms to assist the user in selecting an appropriate AM process, machine, material, or combination of the above, when adopting AM as a method. The user will be able to define a set of alternatives (available equipment, processes of interest, materials of interest, etc.) and set weighted values for predefined criteria through a dedicated web interface. According to the criteria, the Rapid Prototyping Module provides a sorted list of the optimal process/material combinations tailored to the user's application.
After having selected the desired material and technology for their application, the user will be able to define a loading case for the part (i.e. boundary and loading conditions) as well as some initial printing parameters (e.g. layer height, infill density). The part will then be transferred to the simulation toolkit, and the user will receive indications of the structural performance of the part under static loading conditions (i.e. stress/strain components).
The user will then be able to optimize the part's (i) printing parameters and (ii) orientation towards a user-defined goal.
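
For illustration, the ranking step could reduce to a weighted-sum multi-criteria scoring as in the sketch below; the alternatives, criteria and scores are invented placeholders:

```python
# Each process/material alternative is scored per criterion on a 0-1
# scale (higher is better), then sorted by the user's weighted sum.
alternatives = {
    "FFF + PLA":   {"cost": 0.9, "strength": 0.4, "surface": 0.5},
    "SLS + PA12":  {"cost": 0.4, "strength": 0.8, "surface": 0.7},
    "SLA + resin": {"cost": 0.5, "strength": 0.5, "surface": 0.9},
}
weights = {"cost": 0.5, "strength": 0.3, "surface": 0.2}  # user-defined

def rank(alts, w):
    score = lambda scores: sum(w[c] * scores[c] for c in w)
    return sorted(alts, key=lambda name: score(alts[name]), reverse=True)

print(rank(alternatives, weights))  # best-suited combination first
```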

LCA Simulations Engine

The component constitutes a professional web-based tool to collect, analyse and monitor the sustainability performance of products and services. With this tool it is possible to measure the environmental impact of products and services across all life cycle stages.
The tool can detect the hotspots in all aspects of the supply chain, from raw materials to manufacturing, distribution, use and disposal.
The LCA Simulations Engine aims at quantifying the environmental impacts arising from material and process inputs and outputs, such as energy consumption or air emissions, over the entire life cycle of a product, process or service, to assist industries and consumers in making decisions that benefit the environment through the establishment of circular manufacturing and a circular economy. Practical and concrete environmental data are introduced with indicators, enabling comparative studies and the estimation of progress against environmental targets, which in turn enables continuous improvement in production.
The component receives data (regarding the environmental load of a product or process) from other components via their APIs, calculates the environmental impact of the product or process and reports it to the user in real time. The component provides information about the environmental impact of each part of the life cycle (e.g. production, use, end of life), and through its API the LCA calculations can be retrieved by other components. The various data are analysed through state-of-the-art LCA methodologies.
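
As a minimal numeric illustration of such a calculation, the sketch below sums quantity times characterization factor per life cycle stage; the factors and inventory values are placeholders, not real LCA data:

```python
# Toy LCA sketch: impact = sum over flows of quantity x factor,
# reported per life cycle stage. Factors below are placeholders.
FACTORS_KG_CO2E = {"electricity_kwh": 0.4, "steel_kg": 1.9, "transport_tkm": 0.1}

inventory = {
    "production":  {"electricity_kwh": 120.0, "steel_kg": 15.0},
    "use":         {"electricity_kwh": 800.0},
    "end_of_life": {"transport_tkm": 50.0},
}

def impact_by_stage(inv, factors):
    return {stage: sum(qty * factors[flow] for flow, qty in flows.items())
            for stage, flows in inv.items()}

by_stage = impact_by_stage(inventory, FACTORS_KG_CO2E)
print(by_stage)                      # hotspot detection per stage
print("total kg CO2e:", sum(by_stage.values()))
```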

PLM Module

The PLM Module serves as a data hub to collaborate, interact and record the different configurations, methodologies, production techniques, decisions and actions created in KYKLOS 4.0 processes.
The PLM Module provides the necessary traceability and configuration control, as part of the Component Management Platform, to support the rapid prototyping and personalised designs of the system as well as the logistics support and handling capabilities. By using international standard formats for data exchange and data storage, the PLM Module is key to KYKLOS 4.0, enabling consumer product manufacturers to integrate products and components that have been independently designed, produced and used, in addition to the manufacturing system itself.
Using ISO 10303 standards for data exchange, sharing and archiving (file exchange, APIs and web services), the PLM Module provides a central product data repository for all applications in the project. At the same time, the standards-based approach also allows project-external legacy systems to connect, which is of great benefit for the commercialization of the project results.

The PLM module applies the following methods:

  • Data interoperability: ISO 10303, STEP
  • Data dictionary: ISO 10303-239
  • Data import/export:
    • ISO 10303-242 AIM P21
    • ISO 10303-242 BOM XML
    • ISO 10303-239
  • IoT integration: Arrowhead Framework.
  • DBMS: EXPRESS Data Manager (Jotne product)

Onboarding details of IoT Arrowhead Framework: https://jotne.atlassian.net/wiki/spaces/EDM/pages/3374579716/On-boarding+Document+Arrowhead+Framework+Integration-EDMtruePLM+ISO10303+repository

Features:

  • Data hub to collaborate, interact and record the different configurations, methodologies, production techniques, decisions and actions in an environment using the ISO 10303 standard.

Manufacturing Management Component

The Manufacturing Management component aims to support factories that produce customized products in small series.
The configuration of products is facilitated with the help of the harmonized components developed within this project. The component provides the environment in which the specialist manages the received orders and tracks the production of the products until the delivery date.

The main functionalities of this component will be the following:

  • Specific features for production tracking will be implemented (the data related to the materials and the production status will be displayed in an organized manner).
  • Sales order fulfilment – track the availability of the required product and synchronize the sales orders from multiple channels into a single one.
  • Dashboards to track the materials, purchasing, material costs and production operations.

At the front-end level, the information received from the pilot system will be displayed in a centralized manner to assist the specialists, but the main functionality of this module is to provide an integration layer between the MES and the PLM/KYKLOS 4.0 Backend.

With the help of this prototype, the specialists:

  • will have a centralized view of the received orders for special products.
  • will easily manage the current stock and the bill of materials.
  • will make, assemble or configure the order more easily and precisely, according to the requirements received from the customer.
  • will have a clear view of the final product, according to the virtual simulations.
  • will reduce product testing time, thanks to the virtual configuration and 3D visualization tools.
  • will easily track and schedule the production process with the help of an intelligent calendar and automatic planner tool.

Features:

  • Production tracking.
  • Sales Order fulfilment.
  • Resource workflow management.

Blockchain-based Auditing Platform

Audit trails are information records that register important events to provide supporting documentation and history, useful for security and operational actions or for mitigating challenges, including details about the date, time and user associated with each event. They provide a baseline for audit or analysis when needed (an error has been detected, risk management, etc.).

In KYKLOS, the auditing functionalities are implemented by means of Smart Contracts deployed on a Blockchain, which provides a tool with the following benefits:

  • It maintains information integrity; information recorded in the Blockchain cannot be tampered with. If a transaction is incorrect, a new modified transaction must cancel out the historical one when being entered. In this way, dedicated fraud prevention and detection become almost unnecessary.
  • It guarantees authenticity; Blockchain provides an unalterable guarantee of origin.
  • It guarantees long-term maintenance of audit information; in fact, once information has been entered into the Blockchain it can never be erased; consequently, the audit trail will always be traceable.

The Blockchain based auditing system will provide the following functionalities:

  • It provides the means for all technical components in KYKLOS to supply relevant information to be audited. Each component has full control over the information related to it; this means that only the component itself can provide or modify its associated information.
  • It provides information recording in the Blockchain, taking advantage of its inherent benefits (integrity, authenticity, long-term maintenance…).
  • It provides a graphical interface for auditors (official auditors, system administrators, etc.) to easily look at the information recorded in the Blockchain from the different technical components in KYKLOS.
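
The sketch below illustrates only the tamper-evidence property that the Blockchain provides, using a plain hash chain in Python; it is not the project's Smart Contract, merely a demonstration of why a record that commits to its predecessor makes any later modification detectable:

```python
import hashlib, json, time

def append_event(trail, component, payload):
    """Append a record that commits to the previous record's hash."""
    prev = trail[-1]["hash"] if trail else "0" * 64
    record = {"ts": time.time(), "component": component,
              "payload": payload, "prev": prev}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    trail.append(record)

def verify(trail):
    """Recompute every hash; any edited record breaks the chain."""
    prev = "0" * 64
    for rec in trail:
        body = {k: v for k, v in rec.items() if k != "hash"}
        if rec["prev"] != prev or rec["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        prev = rec["hash"]
    return True

trail = []
append_event(trail, "LCA Engine", {"event": "report generated"})
append_event(trail, "Marketplace", {"event": "order matched"})
print(verify(trail))                          # True
trail[0]["payload"]["event"] = "tampered"
print(verify(trail))                          # False: violation detected
```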

Data Manager

The Data Manager component is used for the definition and modelling of manufacturing processes using a logic-based framework and following the Business Process Model and Notation (BPMN) standard.
The component provides the user with the ability to manage and allocate resources in a timely and efficient manner, as well as to monitor a specific manufacturing process and its relevant data. Through its intuitive interface and design, it enables better design, engineering and planning of manufacturing processes, and simultaneously provides the user with the ability to compare and contrast current processes with optimal process scenarios, thus leading to business process optimization.
The Data Manager Tool aspires to be a central point of focus for the interplay of Smart Production Systems, Data Analytics, Cyber-Physical Systems, the Internet of Things and potential Business Process Management improvements. It enables a Business Process Management approach to manufacturing operations, with a new look at how manufacturing processes are defined and supported by enabling technologies. It thus takes a holistic approach to managing industrial business processes and provides a process orchestration platform to define, model and execute these work processes, ensuring that the necessary data sources, applications, events and stakeholders are always integrated.

Features:

  • Manage and allocate resources in a timely and efficient manner.
  • Monitor a specific manufacturing process and its relevant data.

Parametric Design Methodology

The methodology is based on the implementation of parametric design and is potentially applicable to all products that are to be customized to the final user's anthropometry.
The methodology first defines a correspondence between specific user anthropometric measurements and product sizes, and translates it into rules for product size selection and parametrization. Secondly, state-of-the-art commercial software is used to develop a parametric model of the product for fast reconfiguration of the whole assembly according to the selected product size. As a last step, a 3D virtual manikin model is used to validate the parametric design process and to quickly and virtually validate any new feature, functionality or accessory of the product that is affected by user anthropometry.
For products whose service life is affected by changes in the patient's size, the virtual manikin allows patient size changes to be simulated and product life to be estimated. Finally, the 3D virtual manikin makes it easy to simulate a panel of specific characteristics of the user's body (e.g. pathology, deformity) so as to test the developed product on a wide range of virtual users.
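
As a toy illustration of the first step, the sketch below maps two invented anthropometric measurements to a size label and to the parameters that would drive the parametric CAD model; the thresholds and parameter names are placeholders:

```python
def select_size(hip_width_cm: float, lower_leg_cm: float) -> dict:
    """Rule-based size selection and parametrization (illustrative only)."""
    size = ("S" if hip_width_cm < 38 else
            "M" if hip_width_cm < 44 else "L")
    return {
        "size": size,
        "seat_width_mm": round(hip_width_cm * 10 + 20),   # clearance margin
        "seat_height_mm": round(lower_leg_cm * 10 + 50),  # footrest offset
    }

print(select_size(hip_width_cm=41.5, lower_leg_cm=47.0))
# -> {'size': 'M', 'seat_width_mm': 435, 'seat_height_mm': 520}
```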

Features:

  • Reconfiguration of a product's size according to specific anthropometric measurements.
  • A 3D virtual manikin model to validate the parametric design process.

Recommendation Engine

The Recommendation Engine paves the way to efficient customized production. Manufacturers are able to create more personalized products which better fit their customers' needs, thus reducing unnecessary parts, costs, human involvement and waste.
It translates the user's anthropometric measurements into sizes to identify the best-fitting components. It translates non-anthropometric characteristics related to the user's preferences, habits, medical conditions and lifestyle into behaviours and, finally, based on the anthropometric and behavioural profiling, identifies all the components needed to design a wheelchair that fully satisfies both. The function uses a forward-chaining algorithm that simulates a human expert's behaviour to create the most appropriate configuration. The algorithm scans the anthropometric and behavioural profiles, discarding all non-feasible components and retaining only those to be recommended.
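
A minimal forward-chaining sketch is shown below: facts derived from the two profiles fire rules to a fixed point, and components contradicted by the derived facts are discarded. The facts, rules and catalogue are invented for illustration:

```python
# Facts derived from the anthropometric and behavioural profiles.
facts = {"hip_width:44cm", "activity:outdoor", "condition:low_grip"}

# Each rule: (condition over facts, conclusion added when it fires).
rules = [
    (lambda f: "activity:outdoor" in f, "needs:all_terrain_wheels"),
    (lambda f: "condition:low_grip" in f, "needs:assist_rims"),
    (lambda f: "needs:all_terrain_wheels" in f, "exclude:slim_wheels"),
]

changed = True
while changed:                       # forward chaining to a fixed point
    changed = False
    for cond, conclusion in rules:
        if cond(facts) and conclusion not in facts:
            facts.add(conclusion)
            changed = True

catalogue = {"slim_wheels", "all_terrain_wheels", "assist_rims", "headrest"}
excluded = {f.split(":", 1)[1] for f in facts if f.startswith("exclude:")}
print(sorted(catalogue - excluded))  # feasible components to recommend
```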

Features:

  • Anthropometric profiling – translates the user’s anthropometric measures into sizes to identify the best fitting components.
  • Behavioural profiling – translates non-anthropometric characteristics related to the user’s preferences, habits, medical conditions, and lifestyle into behaviours.
  • Recommending best-fitting components – based on the anthropometric and behavioural profiling, identifies all the components needed to design a wheelchair that fully satisfies both, using the forward-chaining algorithm described above.

Advanced Additive Manufacturing Component

The main objective of this toolkit is to provide a database tool, summarizing the different material and process properties, accessible in a Web environment through the Rapid Prototyping Module.
This is a specific software module able to compute the stress analysis of AM components fabricated by Fused Filament Fabrication (FFF) or Stratoconception technology. In both cases, the standard parameters of the raw material available in generic material databases are useless. Instead, suitable data must be provided, accounting for the actual material deposition patterns as well as the filament-to-filament or layer-to-layer adhesion.
To achieve this main objective, we established a complete list of all materials used by the consortium partners. Currently, we are producing a state-of-the-art report of the materials properties and process parameters already available for each listed material, and we are finding the missing information for each listed material through new mechanical tests and process parameters analyses.
The main innovation of this toolkit is that it is a resource available to all KYKLOS 4.0 users who need information on AM materials: not only mechanical information, but also design information like aspect, transparency, flexibility, etc.

Features:

  • Propose a database tool, summarizing the different materials and process properties.
  • Compute the stress-analysis of AM components fabricated by Fused Filament Fabrication (FFF) or Stratoconception technology.

Web 3D Modelling Component

The Web 3D Modelling Component is responsible for providing a web-accessible parametric 3D Customization platform.
This application provides the user with the ability to import 3D geometries or access predefined parametric products; to modify them; and to store, share and export the data that accompany each design.
The Online 3D Modelling Component functions as a parametric configurator accessible through the web. It employs state-of-the-art technologies for 3D graphics in a web environment, able to manipulate and configure 3D models not only using surface mesh visualisation, but also employing powerful topological data structures according to Boundary Representation formats and established standards, such as ISO 10303.
The presented tool leverages a variety of technologies to deliver robust 3D representation capabilities on the web and can potentially be used to reinforce SMEs and other companies, supporting them in the production phase. This is accomplished by providing parametric configuration capabilities for products directly from the web and tightly coupling the output with the production line, making it possible to manufacture such products on demand.

Augmented Reality-based Content Editor

The Augmented Reality-based Content Editor provides an easy-to-use authoring tool to create manuals in AR (with different steps to guide a worker in performing a task) and a visualization tool with which the worker can view the previously created AR manual while performing the task.
Instead of using PDF manuals (digital or on paper), this component allows AR manuals to be created in an easy way, without knowledge of programming languages. The component is focused on learning how to perform a task or how to maintain a machine.

A typical user scenario could be as follows:

  • The shop floor manager adds a new manual to the system using the editor, adding all the multimedia assets (3D files, pdf, videos, photos, etc.) and defining, step by step, the entire manual.
  • The shop floor manager, once the manual has been added, defines through the editor what type of trigger will activate the AR display.
  • Then the shop floor manager publishes that manual from the editor so that it can be consumed by mobile devices, defining the types of users and roles that will have permission to do so.
  • From that moment, any worker in the plant with permissions will be able to trigger, with their device, the AR visualization of that manual to assist them when performing a task (the pattern is triggered when the operator is in front of the machine).

On the one hand, one of the benefits obtained when using this component on the shop floor is the reduction of the time needed to perform a task or a maintenance process, directly impacting production costs and the energy consumed by the machines. On the other hand, this component allows personalization of the information provided in the AR manuals, as well as enhanced support in assembly, operating, maintenance and fixing operations, thus reducing potential errors and execution time and enabling non-specialized personnel to execute these tasks. In the case of paper manuals, the paper needed for a step-by-step manual is eliminated.

Virtual Production Line Orchestrator

The Virtual Production Line Orchestrator uses Digital Twins Technologies to implement, execute and monitor production workflows that could be inter- or intra-company.
The main foreseen scenarios are the production of personalized components and the replacement of existing components with new ones (circularity by design).
Main functionalities:

  • Identify suppliers – identify where a machine with the needed characteristics is available.
  • Negotiate terms of collaboration – costs, date of deliverable, specifications.
  • Update and synchronize the workflow – Create virtual production line (workflow) with new (virtual) machine producing components.

This will bring the following benefits:

  1. The ability to create a virtual production line making the best use of asset availability, either from inside or outside the factory, to respond to different personalization needs.
  2. Continuous operation control, with flexible and scalable maintenance.
  3. Optimization of the manufacturing process.

KYKLOS 4.0 Marketplace

The Marketplace is the KYKLOS 4.0 hub, connecting all suppliers and customers and allowing for the easy interconnection between offers and requests.
Tightly connected to the Brokering and Matchmaking component, the Marketplace gives customers access to the best products and services provided by suppliers. Dynamic pricing is a major element: all requests from end-users and the available services of the plants/factories are handled in real time, picking the best combination, at that moment, of qualitative service and best cost for the user.
Real-time notifications forward any change to services/prices to the end-user and provide the status of the production outcome.
The Marketplace offers the opportunity to re-configure materials and available services to create custom products or to act as a virtual manufacturer. This dynamic interaction, powered by agent technology, supports a client-centred operation. Smaller production batches, customized to the client's needs, improve total market efficiency by reducing product surplus and bringing manufacturing capability within reach of all product designers.

Features:

  • Connect customers and suppliers.
  • Easy interconnection between offers and requests.
  • Give access to the best products and services provided by suppliers.

Brokering and Matchmaking

The Brokering and Matchmaking component supports semantic matching of manufacturing offers, in order to find the best supplier able to fulfil a given request for services or products posted to the Marketplace.
The component uses both qualitative and quantitative decision criteria to choose the best supplier.
The broker component connects to the Marketplace to enable negotiation among parties, as automatically as possible.
This component is part of the Marketplace and is essential to match the end-user with the production line and material best suited to their needs.

KYKLOS 4.0 Front-End

The Component provides access to overview data and visualizations, allowing for a holistic view of the platform.
Data from individual components can be correlated with other relevant modules so that the information is complementary to the modules' own interfaces, adding value to the platform as a whole and unifying it at a higher level of abstraction.
This is a helper component that integrates a broad set of innovative components into a single platform.

KYKLOS 4.0 Back-End Infrastructure

This component acts as the KYKLOS 4.0 cloud-based back-end infrastructure, storing and delivering the Consortium-generated data to other components according to their different functionalities, mainly by means of the interoperability layer and its REST API.
Therefore, access to data will be granted to every KYKLOS actor without a large set of configurations, reducing the interaction complexity while preserving the quality of the data.
Thus, the KYKLOS 4.0 backend represents the backbone for data storage and retrieval within the architecture, working in conjunction with the interoperability layer to communicate data throughout the project. To this end, the backend itself will provide connectivity by means of several connectors and standards concerning connectivity (REST, Modbus, MQTT, etc.) and data formats, as well as a graphical web interface to inspect the results of these operations.
It will be developed iteratively along with the Interoperability Layer, from a set of basic conditions fitted to the use cases, getting ahead of integration problems.
The benefits of this component will be the capability of delivering and storing data in a seamless way regardless of its source or latency, allowing different user roles to manage and inspect the informational assets. Data will be leveraged, and in this manner the backend will help enhance the circular manufacturing model, by getting ahead of system failures (predictive maintenance) and enhancing manufacturing processes (by reusing information and finding patterns) for a wide range of purposes.
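
As one illustrative connector, the sketch below bridges MQTT telemetry into a storage REST API; the broker address, topic and ingestion endpoint are assumed placeholders, and the client uses the paho-mqtt 1.x style API:

```python
import json
import paho.mqtt.client as mqtt
import requests

BROKER, TOPIC = "localhost", "shopfloor/+/telemetry"     # assumed
INGEST_URL = "http://localhost:9000/api/measurements"    # assumed endpoint

def on_message(client, userdata, msg):
    """Forward each MQTT telemetry message to the storage REST API."""
    payload = json.loads(msg.payload)
    payload["topic"] = msg.topic                # keep the data source
    requests.post(INGEST_URL, json=payload, timeout=5)

client = mqtt.Client()                          # paho-mqtt 1.x style client
client.on_message = on_message
client.connect(BROKER)
client.subscribe(TOPIC)
client.loop_forever()       # blocking loop; a real service would supervise it
```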

Features:

  • Data storage and retrieval management
  • Data filtering (thresholds) and processing (transformations)
  • Stored data visualization
  • Stored data cloud sharing
  • User-based access control
  • Extraction process monitoring
  • Connectivity by means of several connectors and standards (REST, Modbus, MQTT).

Product Refurbishment Certification

This component produces a condition certificate for a product or piece of equipment; after refurbishment, the product or equipment may be made available in a trustworthy, “as-new” condition.
The component provides the tracking tools to trace the life events of assets and asset components, enabling timely repairs. The certificate issued will contain the general status, as well as the per-component status and the expected remaining useful life (RUL).

Data Pipeline Orchestrator (DPO)

The starting points are Open-Source components such as Apache StreamPipes, Grafana and a time series database, which are orchestrated by the DPO to perform the required data analysis and visualisation. Such a platform is used in industrial pilot projects where tools for data preparation, integration, feature extraction and visualization are needed. Various data pipelines can be realized for different purposes to satisfy different tasks.
The DPO platform is an Open-Source, flexible and cost-effective solution for data integration, thanks to an original pipeline of reliable state-of-the-art components developed and maintained by leading OSS foundations and associations. It is ideal for users who want to set up a coherent data processing pipeline at very short notice, in which advanced analysis and forecasting, also based on AI, can be performed. In this perspective, the DPO can be understood as a “Data hub for AI” platform.

KYKLOS 4.0 Interoperability Layer

The Interoperability Layer works closely with the back-end component to provide small, fault-tolerant processes for transforming and manipulating data, while providing high performance.
It allows monitoring and managing connections to all defined sources, easily creating links between components in a scalable, secure and reliable way. At the same time, the IL component has different means to check both the data extraction processes and the extracted data. For this purpose, the Interoperability Layer offers a graphical user interface.
It applies the definition of a data exchange structure, adding enhancements such as restrictions, units or transformations, being able to deliver notifications about the process. It focuses on data delivery, regardless of latency, format, quality and purpose of the data.
Each particular use case is approached individually, especially with respect to its specific protocols and needs, from which a more generalized application is obtained.
The benefits of this component are the ability to provide direct communication between the backend and the rest of the KYKLOS 4.0 Ecosystem in a clear way, thus centralizing and facilitating data management and security, while enabling Industry 4.0 processes through data exploitation.
The platform is prepared and tested to connect to and collect data from real environments, processing sources in real time and in batch mode. It is able to perform transformations on the data and enables the user to manage and exploit all the data by gathering them in a single source (the back end). It implements several connection protocols and provides different means to verify and share the collected data in a quick manner. It offers an autonomous authentication and authorization system to control access to the data.

Decision Support System

The Decision Support System (DSS) component combines Data Analysis, Data Evaluation and Data Construction with knowledge from domain experts in the manufacturing industry to automatically optimize those processes that can be optimized without human intervention.
The Decision Support System identifies business patterns, trends and cause-effect relationships, and provides stakeholders with appropriate suggestions (courses of action) based on the real-time information available. In the short run, asset tracking and the reuse and recycling of past production materials and resources become feasible for the user. In the long run, the user will be able to utilize the DSS insights to formulate a long-term strategy of circular manufacturing practices, such as efficient reuse and recycling methodologies, waste monetization, and suggested courses of action with regard to high-level strategic business needs.
The three principles of a circular economy are: design out waste and pollution, keep products and materials in use, and regenerate natural systems through reuse, recycling and waste monetization.
The target design philosophy of our component is the definition and creation of a circular manufacturing company as a digital entity modelled around the specific traits and characteristics of its real-life counterpart. Through this process we hope to be able to measure the aspects that enable company-wide transformation by measuring the entire company’s circular economy performance, not limited to just its products and material flows and by supporting decision making and strategy development for circular economy adoption.
The component is a software tool that provides the user, in each production phase, with recommended actions every time an alert is triggered, as well as long-term guidance on ways to better plan and organize the production process. The goal is to build a product that automates as many capabilities as possible and reduces the need for human intervention as much as possible.

Features:

  • Automatically optimizes processes that can be optimized with no human intervention.
  • Identifies business patterns, trends, and cause-effect relationships.
  • Provides stakeholders with appropriate suggestions (courses of action) based on real-time information available.
  • The users will be able to utilize the DSS insights to formulate a long-term strategy of circular manufacturing practices.

For more information please visit IoT Catalogue.