The KnowlEdge project is building a new generation of AI methods and tools for smart manufacturing. Secure and scalable mechanisms to collect, distribute and store huge amounts of heterogeneous data from various shopfloors are key to continuously feeding the different steps of the AI processes. KnowlEdge is implementing a modular data collection platform, with components deployed across the edge, fog and cloud domains, that feeds distributed AI training and inference modules. Raw data are gathered from various shopfloors, each with its own protocols and information models, and pre-processed to unify their formats and to verify and improve their quality before they feed the AI algorithms.
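The pre-processing step described above can be sketched as a simple normalization and validation stage. This is an illustrative example only: the record fields, the `normalize_opcua` mapping and the quality thresholds are assumptions for the sake of the sketch, not the actual KnowlEdge information model or pipeline.

```python
import math
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any

# Hypothetical unified record; field names are illustrative, not the
# actual KnowlEdge data model.
@dataclass
class UnifiedSample:
    shopfloor_id: str
    sensor_id: str
    timestamp: datetime
    value: float

def normalize_opcua(raw: dict[str, Any]) -> UnifiedSample:
    """Map a hypothetical OPC UA-style shopfloor payload to the unified schema."""
    return UnifiedSample(
        shopfloor_id=raw["site"],
        sensor_id=raw["nodeId"],
        timestamp=datetime.fromtimestamp(raw["sourceTs"], tz=timezone.utc),
        value=float(raw["value"]),
    )

def validate(sample: UnifiedSample) -> bool:
    """Basic quality check: reject non-finite or implausible readings."""
    return math.isfinite(sample.value) and -1e6 < sample.value < 1e6

# One raw message from a shopfloor, normalized and quality-checked
# before being forwarded to the AI algorithms.
raw = {"site": "plant-A", "nodeId": "ns=2;s=Temp1",
       "sourceTs": 1_650_000_000, "value": "21.5"}
sample = normalize_opcua(raw)
assert validate(sample)
```

In a real deployment, one such adapter per shopfloor protocol would feed a common, validated stream into the fog and cloud storage layers.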
Nowadays, data can exist in three states: in transit (M2M communications through a digital network), at rest (kept in digital storage) and in use (processed by a digital system). The continuous growth in the number of potential attackers and security threats affecting all three states, combined with the increased value of the datasets, draws strong attention to the security mechanisms put in place to guarantee the privacy and the validity of the data across their entire lifecycle. KnowlEdge applies role-based access control mechanisms and regulates the distribution and storage of data in trusted environments with different levels of security, depending on the data's nature and value. The data collected from the shopfloors are securely maintained in the edge environment at the factory premises, inaccessible from public networks, and continuously feed, via secure communications, the KnowlEdge data storage in the fog and cloud domains, where the AI engines' processes are executed.
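A role-based access control policy of the kind mentioned above can be sketched as a mapping from roles to permitted (resource, action) pairs. The roles, resource names and permissions below are illustrative assumptions, not the actual KnowlEdge policy.

```python
# Hypothetical RBAC policy: each role maps to the set of
# (resource, action) pairs it is allowed to perform.
ROLE_PERMISSIONS: dict[str, set[tuple[str, str]]] = {
    "shopfloor-operator": {("edge-storage", "read"),
                           ("edge-storage", "write")},
    "ai-engineer":        {("fog-storage", "read"),
                           ("cloud-storage", "read")},
    "platform-admin":     {("edge-storage", "read"), ("edge-storage", "write"),
                           ("fog-storage", "read"), ("fog-storage", "write"),
                           ("cloud-storage", "read"), ("cloud-storage", "write")},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Grant access only if the role's permission set contains the pair."""
    return (resource, action) in ROLE_PERMISSIONS.get(role, set())

# An AI engineer may read from the fog storage but not write to the edge.
assert is_allowed("ai-engineer", "fog-storage", "read")
assert not is_allowed("ai-engineer", "edge-storage", "write")
```

Keeping edge-storage permissions restricted to on-premises roles reflects the design choice of holding shopfloor data at the factory, while fog and cloud resources expose only what the AI processes need.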