ABSTRACT

Fog computing is an emerging technology that shifts data processing, analytics, storage, and networking closer to the end devices. This proximity gives IoT systems an advantage by enabling real-time data processing and decision-making. Sensors in IoT systems collect data whose value can diminish without timely computation, making real-time analysis and processing critical. A centralized, cloud-based approach to data analysis increases latency, which slows the data-driven decision-making process. In time-critical systems such as IoT, fog computing is therefore combined with machine learning to enable organizations and systems to make data-driven decisions.

Because sensors in an IoT environment collect colossal amounts of data, transferring all of it to the cloud for processing slows the pipeline and consumes greater bandwidth, strengthening the case for fog computing. Data that would otherwise be processed in the cloud is instead processed on fog nodes. This is where machine learning algorithms come in: deployed on fog nodes, they take in sensor data and produce actionable results. The following sections examine different problems solved using machine learning in fog systems and the approaches taken. We then list the advantages and disadvantages of using machine learning in fog systems. Finally, we look at some real-world use cases of machine learning combined with fog systems.