As the IoT/IIoT quickly expands into the Internet of Everything, new demands are placed on the existing cloud-based model. With today’s centralized cloud, concerns arise as billions of connected devices flood both private and public networks with a seemingly infinite amount of raw data. The industry is quickly realizing that the cloud-centric model must evolve to meet the growing needs of businesses.

by Warren Kurisu, Director of Product Management, Mentor Graphics

What is the answer? Fog computing, a decentralized architectural model that pushes data-processing intelligence out of the cloud and brings compute resources and application services closer to the ground, at the edge of the IoT network. Fog computing addresses the demand for high-speed processing and analytics and improves overall network and system responsiveness. The fog also addresses security, a major concern within the IoT.

This article discusses the IoT market landscape and the fog infrastructure, and how the motive, means, and opportunity exist today for businesses to make a strategic shift to Fog Computing. Also discussed is a technical architecture for a fog implementation, as well as the requirements for a smart device to successfully participate in the fog strategy.

A Rapidly Changing Landscape

It wasn’t too many years ago that the terms “cloud” and “IoT” started to permeate our vocabulary. It’s amazing how quickly we are now facing the challenge of evolving our infrastructure to handle the Internet of Everything. IHS forecasts that the installed base of connected things will grow from 15.4 billion devices in 2015 to 30.7 billion in 2020 and 75.4 billion in 2025. For perspective, it’s interesting to compare those numbers to the projected world population (Figure 1).

Although these statistics account for everything that is connected, including mobile phones and computers, the massive growth will come from other “things” such as connected cars, smart homes, smart grids, wearables, industrial equipment, medical equipment, and anything else that can be connected and collect data.

But it’s not just about the devices; it’s also about the data. According to Cisco Systems, the amount of data created by these devices was 145 zettabytes (ZB) in 2015 and will reach 600 ZB by 2020. (Note: a zettabyte is one billion terabytes, or 10²¹ bytes.) With regard to IoT and the cloud, at least one part of the problem is obvious: if all data storage, processing, and analytics were cloud-based, the sheer amount of data to be transmitted would choke the networks. There are other problems as well. If all decision making happened in the cloud, latencies would be too high for any real-time decision making; how many network hops exist between your device and the cloud? The cost of transport could be an issue; how many dedicated connections would a business require to ensure connectivity? Finally, reliability and security are also concerns; how do you ensure fast failover, and how do you protect critical information?
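
To put those projections in perspective, here is a rough back-of-the-envelope calculation in Python. It is illustrative only: it simply divides the Cisco 2020 data estimate by the IHS 2020 device count, so the per-device averages are not a real traffic measurement.

```python
# Rough, illustrative arithmetic based on the projections cited above.
ZB = 10**21                       # one zettabyte in bytes
devices_2020 = 30.7e9             # IHS projection: connected devices in 2020
data_2020 = 600 * ZB              # Cisco projection: data created by devices in 2020

per_device_year = data_2020 / devices_2020
per_device_day = per_device_year / 365

print(f"~{per_device_year / 1e12:.1f} TB created per device per year")
print(f"~{per_device_day / 1e9:.1f} GB created per device per day")
# Even if only a fraction of this data were ever transmitted upstream,
# hauling all of it to a centralized cloud would saturate the network.
```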

Fog Computing: A Working Definition

Before going further, it is useful to define the term Fog Computing. As you might imagine, there are many and varying descriptions. One very succinct definition comes from a recent IEEE publication entitled “Fog Computing: Helping the Internet of Things Realize its Potential.” In this article, the author describes fog computing as “A distributed paradigm that provides cloud-like services to the network edge. It leverages cloud and edge resources along with its own infrastructure. In essence, the technology deals with IoT data locally by utilizing clients or edge devices near users to carry out a substantial amount of storage, communication, control, configuration, and management. The approach benefits from edge devices’ close proximity to sensors, while leveraging the on demand scalability of cloud resources” (Figure 2).

A Fog Computing approach provides the following benefits:

  • Lower Latencies: fewer network hops mean that time-sensitive data analytics and system responses can be executed within appropriate time constraints.
  • Managed Bandwidth: local processing reduces the core network load.
  • Increased Reliability: systems close to the edge can be designed for fast failover.
  • Storage Management: data can be stored in the most appropriate place, and only critical information from the fog needs to be sent to the cloud (a minimal sketch of this pattern follows this list).
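
As a minimal sketch of the Managed Bandwidth and Storage Management points above, a fog node might keep raw samples local and forward only summaries or anomalies upstream. The function and variable names below are hypothetical and no particular framework is assumed.

```python
import statistics
import time

RAW_LOG = []             # raw samples stay on the fog node (local storage)
ALERT_THRESHOLD = 80.0   # hypothetical limit for a sensor reading

def send_to_cloud(summary: dict) -> None:
    """Placeholder for the 'north' link; only condensed data goes upstream."""
    print("to cloud:", summary)

def process_window(samples: list[float]) -> None:
    RAW_LOG.extend(samples)                 # keep full-resolution data locally
    summary = {
        "ts": time.time(),
        "mean": statistics.mean(samples),
        "max": max(samples),
        "count": len(samples),
    }
    if summary["max"] > ALERT_THRESHOLD:    # escalate full detail only on anomaly
        summary["raw"] = samples
    send_to_cloud(summary)

process_window([71.2, 69.8, 70.4, 83.1])    # one batch of local sensor readings
```
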
Is the Embedded Industry at an Inflection Point?

Is the industry at an inflection point – with all the billions of devices and zettabytes of data swirling about? At the recent ARC Industry Forum in Orlando, Florida, where visionaries and leaders from the world’s leading industrial companies came together to discuss current and future trends and issues, a few key takeaways emerged:

  • Businesses are now at various stages of implementing their cloud strategies. They range from startups that have built brand-new infrastructure to companies still trying to figure out how to get data from their brownfield devices.
  • There is a lot of data to be distributed and analyzed – and a desire to gather even more to progress the field of advanced analytics.
  • Security is top of mind. All participants are concerned about how to implement security in this world of connected devices.

Overall, it appears the table is now set for advancement. Companies have the motive, means, and opportunity to advance their cloud-based architectures.

Motive: For competitive reasons, businesses are now defining new business models that change the rules of the game. This could include 3D printing a car in a local factory, or converting a business from selling a product to selling a service.

Means: By leveraging high-performance connectivity, increasing compute power, and implementing security technologies, businesses can integrate powerful new devices into their factories or systems to realize these new business models.

Opportunity: As the business and technology landscapes rapidly evolve and infrastructures are upgraded, businesses can implement strategies that take full advantage of the capabilities and standardization that enable the IoT and Fog Computing.

Fog Computing: Key Requirements

The overall Fog Computing architecture is feature-rich. After all, the concept of Fog Computing is to bring cloud-like services to the network edge. Figure 3 illustrates what a fog architecture includes.

The layers of the fog architecture place data storage, processing, and analysis at the most appropriate point in the infrastructure, so that requirements for bandwidth, latency, reliability, and scale are satisfied.

How Mentor Enables Fog Computing

At Mentor Graphics, our strength and depth of experience lie within the lower two layers of the architecture shown in Figure 3. These layers are enabled by Mentor’s industry-leading embedded portfolio, designed and developed to enable world-class edge devices and gateways.

From the perspective of these two lower layers, Mentor meets key requirements by providing software tools and runtime environments that are:

  • Scalable: IoT sensors range from tiny, battery-powered devices with basic processing capabilities and connectivity to more fully featured Linux-based devices and gateways, each with the ability to scale data storage and processing as required by the fog architecture. Today’s designs are consolidating edge functionality on complex, heterogeneous System on Chip (SoC) architectures, with a mix of real-time operating system (RTOS) and Linux capabilities.
  • Connected: These devices must be able to connect “east/west” to the network of connected devices and “north” to the higher layers in the system, and directly to the cloud. Ethernet and wireless (Wi-Fi, Bluetooth, etc.) are a must, along with support for industry-specific protocols such as EtherCAT, OPC-UA, and Data Distribution Service (DDS). Cloud protocols including HTTP, MQTT, and CoAP are also required.
  • Secure: Mentor’s platforms can enable security from power-on, authenticating every bit of code that is subsequently loaded and executed on the system. This enables security of data at rest, data in use, and data in motion, providing assurance that the data in the fog architecture can be trusted (a brief sketch of a secured cloud connection follows this list).
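
As a brief sketch of the “north” cloud link and of protecting data in motion, the snippet below publishes a telemetry message over MQTT with mutual TLS using the open-source Eclipse Paho client. The broker hostname, topic, and certificate paths are placeholders, and this is an illustration rather than code from any Mentor product.

```python
import json

import paho.mqtt.client as mqtt  # Eclipse Paho MQTT client (pip install paho-mqtt)

# paho-mqtt 1.x style constructor; 2.x additionally takes a CallbackAPIVersion argument.
client = mqtt.Client(client_id="gateway-001")

# Mutual TLS: verify the broker and present the gateway's own certificate,
# protecting data in motion. The certificate paths below are placeholders.
client.tls_set(ca_certs="ca.crt", certfile="gateway.crt", keyfile="gateway.key")

client.connect("broker.example.com", 8883)       # hypothetical cloud-side broker
client.loop_start()

payload = json.dumps({"device": "gateway-001", "temp_c": 21.4})
client.publish("plant/line1/telemetry", payload, qos=1)   # QoS 1: at-least-once delivery

client.loop_stop()
client.disconnect()
```
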
It’s All About the Smart Device

It’s been noted that the Industrial IoT (IIoT) begins at the smart device level. These devices, which must be scalable, connected, and secure, are the basis on which cloud and fog architectures are built.

One such example, demonstrated by Mentor at last year’s ARM Technology Conference, is a distributed medical application. The demo consisted of a patient monitor that aggregated and processed data from a distributed set of sensors collecting patient electrocardiogram (ECG), blood pressure, and pulse information (Figure 4). The data communication was enabled by Real Time Innovations’ Connext DDS integrated with both Mentor Embedded Linux and the Nucleus RTOS. The distributed data can be captured, stored, and analyzed locally, and used to generate real-time patient alarms and events. Critical patient information can then be sent up to the cloud for remote monitoring, clinic access, or advanced analytics.
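
The pattern behind that demo can be sketched as follows. This is an illustration only, not the demo’s actual code and not the RTI Connext DDS API; the alarm thresholds and names are hypothetical.

```python
from dataclasses import dataclass, asdict

@dataclass
class Vitals:
    ecg_bpm: float        # heart rate derived from ECG
    systolic_mmhg: float  # systolic blood pressure
    pulse_bpm: float      # pulse from a separate sensor

def check_alarms(v: Vitals) -> list[str]:
    """Local, low-latency alarm logic; thresholds are illustrative only."""
    alarms = []
    if v.ecg_bpm < 40 or v.ecg_bpm > 150:
        alarms.append("ECG rate out of range")
    if v.systolic_mmhg > 180:
        alarms.append("hypertensive reading")
    return alarms

def on_sample(v: Vitals, publish_to_cloud) -> None:
    alarms = check_alarms(v)            # decided at the bedside, not in the cloud
    if alarms:                          # only critical events leave the fog
        publish_to_cloud({"vitals": asdict(v), "alarms": alarms})

# The abnormal sample is escalated; the normal one stays local.
on_sample(Vitals(162.0, 120.0, 160.0), publish_to_cloud=print)
on_sample(Vitals(72.0, 118.0, 71.0), publish_to_cloud=print)
```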

Conclusion

As the number of connected devices and the volume of data grow exponentially, solutions are required to ensure that data storage, data transmission, analytics, and system response are optimized from the edge device to the cloud. Businesses are now moving from planning to implementing their cloud strategies, and Fog Computing is quickly gaining favor across industries. Mentor Embedded has spent decades building an industry-leading portfolio that can be leveraged to build smart devices that enable a cloud and fog strategy and address many of the issues today’s businesses are facing.

Author Bio:
Warren Kurisu is the director of product management in the Mentor Graphics Embedded Systems Division, overseeing the embedded runtime platform business for the Nucleus RTOS, Mentor Embedded Linux, virtualization and multicore technologies, safety-certified runtimes, graphics, and development tools. Warren has spent nearly 30 years in the embedded industry, both as an embedded developer and as a business executive, working broadly in industries including aerospace, networking, industrial, medical, automotive, and consumer. Warren holds a master’s degree in Electrical Engineering from the University of Southern California and a Master of Business Administration from the University of California, Berkeley.

www.mentor.com