
The Vital Role of Edge Computing for IoT: 2017 Update

Premise

We’ve updated our Business Value of Technology: Edge IoT Computing with new data. It confirms our belief: IoT Edge needs will greatly influence future architectures and infrastructure priorities.

Executive Summary

AT&T and AWS recently announced a new strategic relationship. In one aspect of this relationship, they will coordinate the introduction of AT&T IoT-connected sensors and devices preconfigured to seamlessly and more securely send data into the AWS Cloud. AT&T offers a set of IoT technologies that aim to put cell-phone technology in each sensor and link it over a cellular network to multiple cloud services, including AWS.

Wikibon’s previous research analyzed the Hybrid IoT Model, where the majority of data (95%+) is processed and analyzed at the Edge, and only summary or critical data is transferred to a central private or public cloud. The research compared this to the AWS approach, where all the data is sent directly to the public cloud to be processed. It showed, especially for Industrial IoT, that although processing costs are lower in the cloud, communication costs dominate the overall cost. A Hybrid IoT Model is significantly lower cost and functionally better.

The AT&T cellular solution described above has the potential to lower communication costs. An additional cellular scenario has therefore been added to the previously researched scenarios. The updated analysis shows this cellular approach is a low-cost way of linking up a few sensors, and it reduces communication costs significantly. However, the overall solution is still about 3 times more expensive than a Hybrid IoT solution in which the majority of the data is held and processed at the Edge and only a small percentage (5%) is sent to the cloud. In addition, the cellular solution offers lower availability, reliability and performance than the other two scenarios.

This Business Value of Technology research report also looks at the technology changes that are projected to happen over the next few years, and assesses the impact on IoT solutions. There are a number of communication vendors that are looking to virtualize, simplify and reduce the cost of moving data, including the AT&T NetBond® service AWS will integrate with. However, almost all the other technology trends point to communication costs continuing to dominate and get comparatively bigger in the future. Even more important, the relative elapsed-time penalty of moving data to the cloud and back again to the Edge will get worse; Wikibon expects that more processing will be moved to the Edge over time.

Wikibon confirms our previous conclusion that Edge computing will be a vital component of IoT, and that the cost and business benefits of using a Hybrid IoT Model will increase over time for most IoT environments. This research looks at some classes of IoT where there will be greater use of the centralized cloud, but still concludes that for almost all IoT cases some Edge pre-processing and data reduction will reduce the overall cost of IoT. In most IoT cases, moving the processing to the data will result in significantly greater function and lower costs, at the price of more complex distributed software.

It should be noted that traditional hybrid clouds are very different from the IoT Hybrid Cloud. This research does not seek to evaluate all models of hybrid cloud, only IoT hybrid cloud.

Mobile Is Better, But Doesn’t Take Away the Edge

In a 2015 case study, Wikibon modeled a wind-farm IoT site with 100 sensors and 2 video streams. The research showed that moving all IoT sensor data to the cloud from every device was much more expensive than processing the data on site. The key element of cost was data communication from the IoT site to the cloud. AT&T has recently announced the addition of AT&T SIM cards and equipment to sensors to send data over the cellular internet to the AWS cloud, in addition to previous agreements with IBM Watson Bluemix and Microsoft Azure. AT&T currently offers this as an AT&T IoT Starter Kit, together with M2X (a time-series database), AT&T Flow (connectivity from SIM to Cloud) and IoT Data Plans, to provide an end-to-end solution. AWS and AT&T have agreed to coordinate AT&T IoT-connected sensors and devices to operate seamlessly with the AWS Cloud.

Wikibon has updated the 2015 analysis with this new approach. It adds a SIM card and “phone-like” capability to each sensor, so that it can be individually identified, and moves the data from each sensor to the AWS cloud over the AT&T cell-phone network. The IoT data plan offers the choice of sending the data directly or as SMS messages. The SMS method is easier to set up, but would cost over $1.9 million over 3 years to send each message as an SMS! The direct data transfer costs over the cellular network are significantly lower than those of a dedicated line network, but come with lower quality and reliability.

Figure 1 shows an update to the previous research analysis, with three scenarios:

  1. The original scenario with 100 sensors and 2 low-quality videos connected by a dedicated line to the AWS IoT management with a time-series database in the cloud.
  2. The new scenario of 100 sensors and 2 low-quality videos, with an AT&T SIM card and “phone-like” hardware for each sensor ($800/sensor) and video stream, connected over a cellular network to the AWS cloud IoT management with a time-series database.
  3. The original Pivot3 “Edge” server communicating and managing the 100 sensors and high-quality video locally with a time-series database, with 5% of the data being transmitted over a direct connection to the AWS cloud.

Figure 1 shows that dedicated line costs dominate in Scenario 1 (red in the first column). Scenario 2 has a significantly lower overall cost because cellular communication is cheaper (smaller red), but onsite costs increase sharply because of the individual hardware required for each sensor. Scenario 3 is the lowest cost, because it reduces both the communication costs and the onsite costs by sharing the hardware across all of the sensors. In addition, the direct connection is significantly more reliable, more performant and allows much greater security.

Figure 1 – Comparison of Total 3-year Management & Processing Costs of AWS Cloud + Dedicated Network vs. SIM Hardware + AT&T Cellular Network + AWS Cloud vs. Edge + AWS Cloud + Dedicated Network with 95% Edge Data Reduction (200 Miles)
Source © Wikibon 2017. See Table 1 for details of cloud components and assumptions.

All the calculations and assumptions in the three scenarios are shown in Table 1 below. The result of these calculations is shown in Figure 1 above, with the Edge computing option costing 15% of the Cloud-only processing approach and 33% of the AT&T cellular approach to feeding data to the cloud. The AT&T cellular network is also less reliable, less performant and less secure than a dedicated network.

In summary, a cloud-only strategy (first vertical bar) is dominated by the cost of data movement (in red) and impeded by the elapsed time and uncertainty of delivering data to and from the cloud. The second column represents the cellular approach, which reduces the cost of data transfer but adds very significant cost to the edge equipment, and is a much weaker communication system.

Conclusion: In this scenario and others like it, the only cost-effective and reliable choice is to use a hybrid cloud approach, with fully functional “Edge” computing.

Case Study Methodology

Table 1 shows three sets of costs for each scenario:

  1. Edge data transport costs to the Cloud
  2. Cloud Processing Costs
  3. On-site Equipment Costs
Edge Data Transport

The cost of the dedicated network to transport the data to the cloud is calculated from the total gigabytes per month, the number of miles, and the cost of the network expressed as $/gigabyte/month/mile. The total cost over three years is calculated as $249,000.
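A rough sketch of that arithmetic follows. The $1/GB/mile/month rate and the 200-mile distance come from the assumptions used elsewhere in this report; the monthly data volume is backed out from the stated $249,000 total, so it is an inferred, illustrative figure rather than a value quoted from Table 1.

```python
# Sketch of the dedicated-network transport cost calculation (illustrative only).
# Rate and distance are taken from this report's assumptions; the monthly data
# volume is inferred from the stated 3-year total and may differ from Table 1.

rate_per_gb_mile_month = 1.0   # $/GB/mile/month (assumption used in this report)
miles = 200                    # distance from the IoT site to the cloud
months = 36                    # 3-year analysis period
gb_per_month = 34.6            # inferred: ~$249,000 / (200 miles * 36 months)

dedicated_cost = rate_per_gb_mile_month * miles * gb_per_month * months
edge_cost = dedicated_cost * 0.05   # Edge scenario sends only 5% of the data

print(f"Dedicated network, all data:  ${dedicated_cost:,.0f}")   # ~ $249,000
print(f"Dedicated network, Edge (5%): ${edge_cost:,.0f}")        # ~ $12,500
```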

The cost of the dedicated network in the second column is 5% of the first column, as the assumption is that 95% of the data is processed locally. The use of Edge computing lowers this cost to $12,500 over three years.

The cost of the cellular network is calculated from the Data Plan, which offers 5 gigabytes for $100. The SMS portion of this offering is not used, as there would be over 32 million SMS messages over 3 years, at a total cost of about $1.9 million! The total cost for cellular data transport is $29,000 over three years. This is much cheaper than the dedicated network, but the reliability, performance and security are significantly worse.
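A minimal sketch of the cellular arithmetic follows, using the same inferred monthly data volume as in the sketch above. Plan granularity, per-device plans and overage pricing are not modelled, and the per-SMS price is inferred from the report’s totals rather than a quoted rate.

```python
import math

# Sketch of the cellular data-plan arithmetic (illustrative; plan granularity,
# per-device plans and overage pricing are not modelled).

gb_per_month = 34.6        # same inferred monthly data volume as the sketch above
months = 36
plan_gb = 5.0              # data plan: 5 GB...
plan_price = 100.0         # ...for $100

increments_per_month = math.ceil(gb_per_month / plan_gb)
cellular_data_cost = increments_per_month * plan_price * months
print(f"Cellular data transport (3 years): ${cellular_data_cost:,.0f}")
# Same order of magnitude as the ~$29,000 stated in Table 1.

# SMS alternative: roughly 32 million messages over 3 years. The per-message
# price below is inferred from the ~$1.9 million total, not a quoted rate.
sms_messages = 32_000_000
price_per_sms = 0.06
print(f"SMS alternative (3 years):         ${sms_messages * price_per_sms:,.0f}")  # ~$1.9 million
```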

Cloud Processing Costs

The AWS costs are based on the number of messages received, and are very low relative to the network costs. Again, the number of messages in the Edge processing option (middle column) is only 5% of the Cloud-only processing option, and is therefore 5% of the cost.

On-site Equipment Costs

The on-site equipment costs cover the equipment needed to collect data at a central point on the site, process it if necessary, and send the required data to the cloud.

  • The requirement for the cloud-only solution is simple and costed at $2,000 over 3 years, including maintenance.
  • The cost of the Pivot3 server system is costed at $25,000 over 3 years, including maintenance, and off-site support.
  • The cost of the SIM manager is costed at $800 for each sensor, equivalent to an industrial cellphone and software.

The total 3-year costs are shown in the last row, and are shown in Figure 1 above.
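The sketch below pulls the three scenarios together using round numbers from this report. The AWS processing figures are illustrative placeholders (the report notes they are small relative to the network costs); refer to Table 1 for the actual values.

```python
# Sketch of the three-scenario, 3-year totals using round numbers from this
# report. The "cloud" figures are illustrative placeholders; see Table 1.

endpoints = 102                      # 100 sensors + 2 video streams

scenarios = {
    "Cloud-only + dedicated line": {
        "transport": 249_000,        # all data over the dedicated network
        "cloud":       4_000,        # placeholder for AWS message/processing costs
        "onsite":      2_000,        # simple gateway
    },
    "Cellular (SIM per sensor) + AWS": {
        "transport":  29_000,        # cellular data plan
        "cloud":       4_000,        # same message volume as cloud-only
        "onsite":  endpoints * 800,  # SIM / "phone-like" hardware per endpoint
    },
    "Edge (Pivot3) + 5% to AWS": {
        "transport":  12_500,        # 5% of the dedicated-line cost
        "cloud":         200,        # ~5% of the cloud-only processing placeholder
        "onsite":     25_000,        # shared Edge server, maintenance and support
    },
}

totals = {name: sum(costs.values()) for name, costs in scenarios.items()}
for name, total in totals.items():
    print(f"{name:35s} ${total:>9,.0f}")

edge = totals["Edge (Pivot3) + 5% to AWS"]
print(f"Edge as % of cloud-only: {edge / totals['Cloud-only + dedicated line']:.0%}")    # ~15%
print(f"Edge as % of cellular:   {edge / totals['Cellular (SIM per sensor) + AWS']:.0%}")  # ~33%
```

With these round numbers the Edge scenario comes out at roughly 15% of the cloud-only total and roughly 33% of the cellular total, consistent with Figure 1.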

Table 1 – Cost Comparison of Cloud-only Management vs. Edge + AWS Cloud Processing vs. AT&T Cellular Network & AWS Cloud Processing
Source © Wikibon 2017

AT&T IoT Technology Analysis & Conclusions

AT&T claims to have 29 million devices connected to its network, of which 9 million are vehicles. The AT&T technology is a good fit for monitoring a single sensor, especially a mobile device such as a truck. If the sole requirement is to know where a mobile asset is, or for a single or very few sensors distributed in areas with good cellular connections, the AT&T cellular solution should be considered.

However, the results of the analysis in Figure 1 above demonstrate very clearly that the AT&T cellular approach does not scale as the number of sensors at an end-point scales, and will not be widely adopted for heavy IoT use. This update shows that Edge computing scales much better than either a directly attached Cloud-only approach or a Cellular network approach. In addition, the cellular approach also has many more components and failure points. The lack of air gaps in the architecture will also be of significant concern to OT designers.

Conclusion: Cellular data will be used in many solutions, including car-of-the-future networks. However, this will not be achieved by direct connection to sensors, but rather by consolidation of data into local compute units and efficient multiplexing of data by these compute units to multiple clouds. The cellular network will be a key enabler, but, to rephrase an old phrase, the network will not be the system.

IoT Technology Projections Analysis & Conclusions

Looking forward, the impact of distance on computing in general is increasing. With traditional IT systems based on hard magnetic disks (HDDs), IO response times averaged 20 milliseconds. Using a rule of thumb of 0.2 milliseconds per mile, that is the equivalent of data held in memory 100 miles away. However, the traditional rules are being upended by a number of important technology trends:

  • IO response times have improved to as low as 20 microseconds using flash, a 1,000x improvement. 200 microseconds is easy to achieve with flash storage and NVMe protocols (a 100x improvement).
  • These fast IO rates are achievable on Edge systems running Server SAN configurations, at far lower costs than traditional servers and storage arrays.
  • This means that data in the cloud now has to be only 1 mile or less away to compete with the response time of local storage (see the distance sketch after this list).
  • Initially the small amount of processed IoT data went to human operators, operating with response times measured in seconds. Data could be held or transported across a wide-area network without impacting response time.
  • Increasingly, much larger amounts of IoT data are being transmitted directly to other compute systems. The initiating system will expect a response back measured in milliseconds or less. The emphasis of systems will be to automate as much as possible, and give data in summary form to human operators. As IoT systems become more complex, the data will need to be processed locally.
  • The traditional improvement in the cycle time of processors has ground to a halt at about 4GHz. Improvements in system performance are coming from putting system components much closer together, from more cores per processor, and from parallelism with graphics processors and FPGAs. All of these trends require the data to be local and in large quantities, and the penalty for waiting for remote data becomes higher and higher.
  • The size of these modern systems is rapidly decreasing, and system space is no longer the constraining metric. Power density will take over as the constraining factor, and new chip and system architectures will focus on power efficiency.
  • Movement of data is expensive and takes a long time. The cost is and will continue to be a function of distance and bandwidth. The figure used in the analysis above is $1/GB/Mile/month. This is projected to reduce by 30%/year, and the bandwidth to improve by 30%/year. Long-distance bandwidth and cost improvement tend to be cyclic, with bigger improvements in cost/bandwidth every few years.
  • At the same time the amount of data from sensors and smart sensors is growing at a much faster rate than the 30% bandwidth factor.
  • The speed of light is still the limiting factor in moving data long distances.
  • The simple math in the case study in Figure 1 clearly shows that the cost of moving all IoT data from the edge to a cloud is not economic. The technology trends above show that it will become even less economic over time. Wikibon projects that 95% of IoT data will live and die at the Edge, and this will grow to 99% over the next decade. A strategy to move all IoT data to the “cloud” will be long-term economic suicide for most enterprises! Data scientists will adapt and move the application to the data.
  • It is also clear that Clouds, both private and public, play a vital role in the hybrid IoT systems that must evolve. Subsets of data will be culled from multiple Edge sites and brought to the Clouds for composite analysis.
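The distance rule of thumb referenced in the list above can be expressed as a simple calculation. The 0.2 ms/mile figure is this report’s assumption, and the response times are the HDD and flash figures quoted above.

```python
# Sketch of the distance rule of thumb used above: roughly 0.2 ms of network
# delay per mile, so a storage response time maps to an "equivalent distance"
# beyond which remote data cannot compete with local access.

MS_PER_MILE = 0.2   # rule-of-thumb network delay assumed in this report

def equivalent_distance_miles(response_time_ms: float) -> float:
    """Distance at which network delay alone equals the local response time."""
    return response_time_ms / MS_PER_MILE

print(equivalent_distance_miles(20.0))    # HDD era (~20 ms): ~100 miles
print(equivalent_distance_miles(0.2))     # flash + NVMe (~200 microseconds): ~1 mile
print(equivalent_distance_miles(0.02))    # best-case flash (~20 microseconds): ~0.1 miles
```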

Conclusion: Almost all the technology trends analyzed above point to communication costs continuing to dominate and get comparatively bigger in the future. Even more important, the relative elapsed-time penalty of moving data to the cloud and back again to the IoT Edge will get higher. Wikibon expects that more processing will be moved to the IoT Edge over time. For most IoT hybrid clouds, the majority of data will stay within the Edge, with only summary and extracts shared with other clouds.

Are There IoT Use Cases for No Edge?

Weather Systems

There are specific cases where the amount of central processing is very high and has to be done in one place. An example is the supercomputing requirement of weather models, where all the data has to be available in a single, very large compute environment. At the moment the European weather models are the best, so all the data ends up there, and of course the faster and more accurately the models deliver results, the greater the economic value of the results. The compute costs are the biggest portion of the total costs.

Conclusion: Edge processing can reduce the amount of data that has to be transmitted and decrease transmission times. In addition, Edge processing can ensure that data can be retransmitted in the case of communication failures. Weather data can be significantly aggregated, compressed and standardized by the use of time-series databases and some additional processing close to the edge. This ensures that it arrives faster and at lower cost at its final destination. Overall, the Edge processing for systems with very high central compute costs will be a much lower proportion of the total system cost.
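As a minimal sketch of what such Edge-side aggregation might look like: raw readings are rolled up into per-window summaries before transmission. The one-minute window, the summary fields and the sensor name are illustrative choices, not a description of any particular product.

```python
# Minimal sketch of Edge-side aggregation before transmission: raw readings
# are collapsed into per-window summaries so only a small fraction of the
# data leaves the site. Window size and summary fields are illustrative.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Summary:
    sensor_id: str
    window_start: int          # epoch seconds
    count: int
    min_value: float
    max_value: float
    mean_value: float

def summarize(sensor_id: str, window_start: int, readings: list[float]) -> Summary:
    """Collapse one window of raw readings into a single record for the cloud."""
    return Summary(sensor_id, window_start, len(readings),
                   min(readings), max(readings), mean(readings))

# Example: 60 one-second readings become one record, a large reduction in rows.
raw = [20.0 + 0.01 * i for i in range(60)]
print(summarize("turbine-7-temp", 1_500_000_000, raw))  # hypothetical sensor name
```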

Network Systems

For some network systems, the value of data at the center is greater than the value at the edge. An example is electrical power networks, which require specific IoT data (not all of it by any means) to be transmitted to a central location, where it can be processed and monitored to minimize the risk of blackouts across the whole network. As in the first case, transmission can be sped up by aggregation, compression, standardization and retransmission capabilities at the Edge. In addition, there is also a significantly greater amount of data that is mainly of local relevance.

Some Industrial supply chains will have greater value if there is a central repository of a subset of the data. For example, an understanding of demand and supply for a specific automotive part may help car manufacturers to promote or reduce demand for cars that utilize that part.

Multiple Secure Clouds 

However, there can also be unintended consequences of such centralization of data. In the case of an automotive industrial supply chain, financial investors and hedge funds with access to that same data would have enormous leverage in exploiting that data at the expense of other investors.

In the same way, data fed back to automotive manufacturers from IoT devices in cars could be sold to insurance companies to set rates, or sold to investors to assess ride-share company performance.

In order to meet the need for confidentiality and security, Edge centers will need the capability to create and tear down distributed connections with other parties and other clouds, and to make and verify transactions with limited data sharing instantaneously and without a central authority. The same technology can be used for sampling data for use in developing improved models for deployment at the edge or centrally. Blockchain technologies are a good starting point for the design of such systems, and IBM is a leading vendor and proponent of Blockchain solutions.

In addition, the design of networks must allow clear air-gaps and enable end-to-end encryption at an application or system level, rather than at a network or storage level.

The vast majority of practical machinery generates data that is mainly relevant to that site, with a small percentage needing to be processed centrally. Wikibon projects that 95% of data will never have to leave the edge, and this percentage will grow over the next decade to be 99%.

Conclusion: The biggest challenges for IoT are creating the distributed networks, distributed databases and distributed analytic software that can be managed from a site remote from the Edge, without any local expertise required.

Action Item:

Enterprise executives with responsibility for implementing a long-term strategy for IoT should ensure that the architecture they adopt is a full hybrid IoT cloud solution, capable of managing, storing, processing, protecting and deleting 95% of data at the Edge today, and about 99% of data at the Edge by the end of the strategic project.

Footnote:

Added 12/12/2017: Corrections to the calculations in Table 1 and the addition of Table 2 with a full explanation of the calculations. Many thanks to Lal Chandran of Ericsson in Sweden (lal.chandran@ericsson.com) for his detailed review of the calculations in Tables 1 & 2.

Wikibon has published additional research on “Choosing a Partner for Implementing a Hybrid Model of IoT“.

 

Table 2 – Details of Calculations for Transmission Costs
Source © Wikibon 2017

 
