The true potential of Industrial IoT can only be achieved through the introduction of Artificial Intelligence. In this article we go beyond IoT and focus on data analytics and data exploitation, because for us IoT without Big Data is nothing.
When we write about Industrial IoT at Barbara, we compare it to the nervous system of a company: a network of sensors that collects valuable information from every corner of a production plant and stores it in a repository for data analysis and exploitation. In those articles, we emphasize the need to measure and obtain data in order to make informed decisions.
But what happens next? What should we do with all that data? We always talk about making good decisions based on reliable information, but, obvious as it may sound, that is not always easy to achieve. In this article we will go a bit beyond IoT and focus on the data itself and how to exploit it.
We’ll talk about the analysis phase, the process that turns data first into information and then into knowledge; what some refer to as business logic. In the end we are not straying far from IoT, because for us IoT without Big Data is meaningless.
In recent decades, especially in the last one, we have witnessed an incredible flood of data (structured and unstructured), mass-produced by the ubiquity of digital technologies. In the Industrial world in particular, exploiting this huge amount of information is of great value.
This need to process business data has given rise to Big Data, Data Science or Data Analytics, which we could define as the processes we follow to examine the data captured by our network of devices, with the aim of revealing hidden trends, patterns or correlations, always with the underlying idea of improving the business with new kinds of knowledge.
There are different definitions of Big Data. One of them, from Gartner, considers three key aspects: the volume of data, its variety, and the velocity with which it is captured. These are the so-called 3Vs, although others extend them to 5Vs by adding the veracity of the data and the value it brings to the business.
We believe, though, that it does not make much sense to get into theoretical disquisitions about what is and what is not Big Data, because by now practically everything is Big Data.
How do IoT and Big Data relate to each other? The main point of contact is usually a database. In general terms, we could say that the work of IoT ends at that database, i.e. its goal is to dump all the acquired data, in a more or less orderly manner, into a common repository. The domain of Big Data starts by going back to that repository to work with the data and extract the information needed.
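This handoff can be sketched in a few lines. The snippet below is a minimal illustration, not a real plant setup: an in-memory SQLite table stands in for whatever shared repository is used, the sensor names and readings are invented, and the "IoT side" and "Big Data side" comments mark where one domain ends and the other begins.

```python
import sqlite3

# Hypothetical shared repository: an in-memory SQLite database stands in
# for the real store (time-series database, data lake, ...).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE telemetry (sensor TEXT, ts INTEGER, value REAL)")

# --- IoT side: its job ends once the readings are dumped into the repository.
readings = [("temp-01", 1, 71.2), ("temp-01", 2, 73.9), ("temp-02", 1, 68.4)]
db.executemany("INSERT INTO telemetry VALUES (?, ?, ?)", readings)

# --- Big Data side: goes back to the same repository to extract information.
rows = db.execute(
    "SELECT sensor, AVG(value), MAX(value) FROM telemetry GROUP BY sensor"
).fetchall()
for sensor, avg, peak in rows:
    print(f"{sensor}: avg={avg:.1f}, peak={peak}")
```

The point is the boundary: the devices never aggregate, and the analytics never talks to a device; the repository is the only contract between the two worlds.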
In any case, it is interesting to visualize this Big IoT Data Analytics as a toolbox. A box from which we will draw one tool or another depending on the type of information and knowledge we want to extract from the data. Many of these tools are traditional algorithms, or improvements and adaptations of them, built on very similar statistical and algebraic principles: algorithms that were not even invented in this century, which surprises many people, who wonder why they are relevant now and were not then.
The quick answer is that the volume of data available is now much greater, but above all that the computing power of today’s machines allows these techniques to be used at a much larger scale, giving, in a way, new uses to old methodologies.
But we do not want to give the impression that everything has already been invented and that the current wave of data analysis has brought nothing new; that is not true. The data ecosystem is very broad and has seen strong innovation in recent years.
One of the fastest-growing areas is Artificial Intelligence. Many will argue that it is not a recent invention, since the phenomenon was being discussed as early as 1956. However, Artificial Intelligence is so broad and its impact so big that it is often treated as a self-contained discipline. The reality is that, in some way, it is part of Big Data or Data Analytics: another of the tools that was already in our toolbox and found a natural evolution with AIoT.
The exponential growth in the volume of data requires new ways of analyzing it. In this context, Artificial Intelligence becomes particularly relevant. According to Forbes, the two main trends that are dominating the technology industry are precisely the Internet of Things (IoT) and Artificial Intelligence.
IoT and AI are two independent technologies that have a significant impact on multiple verticals. While IoT is the digital nervous system, AI becomes an advanced brain that makes the decisions that control the overall system. According to IBM, the true potential of IoT will only be achieved through the introduction of AIoT.
But what is Artificial Intelligence and how is it different from conventional algorithms?
We usually speak of Artificial Intelligence when a machine (Artificial) mimics the cognitive functions (Intelligence) of humans; that is, when it solves problems the way a human would. Put another way, the machine is able to find new ways of understanding data, new algorithms to solve complex problems, without the programmer (and this is the key) knowing them, i.e. without the programmer programming them. So we could think of Artificial Intelligence and, in particular, Machine Learning (the part with the greatest projection within AI) as algorithms that invent algorithms.
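A toy sketch can make "algorithms that invent algorithms" concrete. Here, instead of a programmer hard-coding a rule like "IF temperature > X THEN alarm", the machine derives X itself from labeled examples. The data, labels and threshold search below are all invented for illustration; real Machine Learning uses far richer models, but the principle is the same.

```python
# Made-up training data: (temperature, label) where 1 = overheating, 0 = normal.
samples = [(61.0, 0), (64.5, 0), (66.0, 0), (71.5, 1), (74.0, 1), (78.5, 1)]

def learn_threshold(data):
    """Pick the cut point that misclassifies the fewest training examples."""
    best_cut, best_errors = None, len(data) + 1
    for cut, _ in data:
        errors = sum((x > cut) != bool(y) for x, y in data)
        if errors < best_errors:
            best_cut, best_errors = cut, errors
    return best_cut

cut = learn_threshold(samples)   # the "invented" rule: nobody wrote this boundary down
predict = lambda x: x > cut      # the resulting decision function
print(cut, predict(76.0), predict(60.0))
```

The programmer wrote the *learning procedure*, not the decision rule; the rule itself came from the data.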
The combination of IoT+AI brings us AIoT (Artificial Intelligence of Things), intelligent and connected systems that are able to make decisions on their own, evaluate the results of these decisions and improve over time.
This combination can be done in several ways, of which we would like to highlight two:
1. On the one hand we could continue to speak of that brain as a centralized system that processes all impulses and makes decisions. In this case we would be referring to a system in the cloud that centrally receives all telemetry and acts. It would be Cloud AI (Artificial Intelligence in the Cloud).
2. On the other hand, we must also talk about a very important part of the nervous system: reflexes. Reflexes are autonomous decisions that the nervous system makes without the need to send all the information to the central processor (the brain). These decisions are made in the periphery, close to where the data originated. This is called Edge AI (Artificial Intelligence at the Edge).
Edge AI is an emerging area with great potential to bring intelligence to all kinds of industrial equipment. Industrial edge solutions are necessary for real-time decision making, enhanced security and high reliability.
Cloud AI provides a thorough analysis process that takes into account the entire system, whereas Edge AI gives us speed of response and autonomy. But as with the human body, these two ways of reacting are not mutually exclusive, but can be complementary.
In fact, industrial organizations already using AI at the edge report better production quality and far fewer maintenance issues.
As an example, a water control system can close a valve in the field the moment it detects a leak, to prevent major water losses, and in parallel notify the central system, where higher-level decisions can be made, such as opening alternative valves to channel the water through another circuit. The possibilities are endless and go beyond this simple example of «reactive» maintenance: the same setup can predict possible events, enabling «predictive» maintenance.
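The edge "reflex" just described can be sketched as follows. This is an illustrative skeleton, not Barbara's implementation: the threshold, the valve IDs and the `close_valve` / `notify_cloud` stubs are all assumptions standing in for real fieldbus commands and a real uplink (e.g. MQTT).

```python
LEAK_FLOW_DELTA = 5.0  # assumed threshold: inflow-outflow gap that signals a leak

events = []  # recorded here only so the sketch is self-contained

def close_valve(valve_id):
    events.append(("close", valve_id))        # stand-in for a fieldbus command

def notify_cloud(payload):
    events.append(("notify", payload))        # stand-in for an MQTT/HTTPS upload

def on_reading(valve_id, inflow, outflow):
    """Edge reflex: act locally first, then report for higher-level decisions."""
    if inflow - outflow > LEAK_FLOW_DELTA:
        close_valve(valve_id)                 # immediate, local reaction
        notify_cloud({"valve": valve_id, "loss": inflow - outflow})

on_reading("V-12", inflow=40.0, outflow=31.0)   # 9.0 gap -> reflex fires
on_reading("V-13", inflow=40.0, outflow=38.0)   # 2.0 gap -> no action
print(events)
```

Note the ordering: the valve is closed before anything is sent upstream, so the real-time reaction never waits on connectivity to the cloud.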
Another use case is the deployment of Artificial Intelligence models at the Edge to predict chemical levels in a water supply and purification plant, based on real-time variables and in a cybersecure manner. Water operators can thus optimise the reagent dosing control loops, achieving more precise control of the monitored variables and lower costs thanks to the reduced use of chemicals.
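A minimal sketch of this pattern, under invented numbers: a simple linear model relating one input variable (turbidity) to a reagent dose is fitted from historical pairs, and only the cheap prediction step then runs on the edge device. Real dosing models use many variables and richer techniques; this just shows the train-centrally, predict-locally split.

```python
# Invented historical pairs: (turbidity in NTU, coagulant dose in mg/L).
history = [(2.0, 11.0), (4.0, 15.0), (6.0, 19.0), (8.0, 23.0)]

# Ordinary least-squares fit of dose = slope * turbidity + intercept.
n = len(history)
mean_x = sum(x for x, _ in history) / n
mean_y = sum(y for _, y in history) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
         / sum((x - mean_x) ** 2 for x, _ in history))
intercept = mean_y - slope * mean_x

def predict_dose(turbidity):
    """Runs locally at the edge: no round-trip to the cloud per reading."""
    return slope * turbidity + intercept

print(predict_dose(5.0))
```

The fit can happen anywhere (typically in the cloud, on the full history); only the two learned coefficients need to be shipped to the edge node.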
Doing all this data capture and transmission in the cloud is extremely costly, while also entailing increased security risks due to a larger attack surface (an extremely important factor in a sector such as water treatment and critical infrastructure). The Edge makes it possible to run these algorithms locally for real-time operation, at low cost and with unbeatable security standards.
With the evolution of Machine Learning and technologies such as the cloud and Edge Computing, these processes have become fully automated.
Another example of AIoT can be found in Smart Grids, where smart devices at the edge analyze the electricity flows at each node and make load-balancing decisions locally, while in parallel sending all this data to the cloud for nationwide processing. Analysis at this macroscopic level would allow load-balancing decisions at a regional level, or even decreasing or increasing electricity production by shutting down hydroelectric plants or launching a power purchase from a neighboring country.
Research output on Edge Computing has exploded in recent years, going from about 2,000 papers annually to more than 25,000, according to Google Scholar. Furthermore, the International Data Corporation (IDC) predicts that the Edge Computing market will double in the next four years. According to Kevin Scott, CTO of Microsoft, edge intelligence is proving to be the last mile in the convergence of the digital and physical worlds.
While an increasing number of IoT use cases demand a higher degree of edge processing, solutions at the edge still grapple with the challenges of secure connectivity and application management. This is where Barbara comes in: our edge node platform, secure by design, enables one-click, centralized deployment, management and configuration of applications on the nodes.
If you would like to learn more about how to implement AI at the Edge, get in touch.