By Rebecca O’Day and Norman Quesnel
Senior Members of Marketing Staff
Advanced Thermal Solutions, Inc. (ATS)
Expanding the Internet of Things (IoT) into time-critical applications, such as autonomous vehicles, means finding ways to reduce data transfer latency. One such approach, edge computing, places some computing as close to connected devices as possible. Edge computing pushes intelligence, processing power and communication capabilities from a network core to the network edge, and from an edge gateway or appliance directly into devices. The benefits include improved response times and better user experiences.
While cloud computing relies on data centers and communication bandwidth to process and analyze data, edge computing offloads some of that work by handling less compute-intensive tasks in other components of the architecture, near where data is first collected. Edge computing works with IoT data collected from remote sensors, smartphones, tablets, and machines. This data must be analyzed and reported on in real time to be immediately actionable. [1]
In the edge computing scheme developed by Inovex, the layers are described as follows (a code sketch of the resulting data flow appears after the list):
Cloud: On this layer, compute power and storage are virtually limitless. However, latency and the cost of data transport to this layer can be very high. In an edge computing application, the cloud can provide long-term storage and manage the immediate lower levels.
Edge Node: These nodes are located before the last mile of the network, also known as downstream. Edge nodes are devices capable of routing network traffic and usually possess high compute power. The devices range from base stations, routers and switches to small-scale data centers.
Edge Gateway: Edge gateways are like edge nodes but are less powerful. They can speak most common protocols and manage computations that do not require specialized hardware, such as GPUs. Devices on this layer are often used to translate for devices on lower layers. Or, they can provide a platform for lower-level devices such as mobile phones, cars, and various sensing systems, including cameras and motion detectors.
Edge Devices: This layer is home to small devices with very limited resources. Examples include single sensors and embedded systems. These devices are usually purpose-built for a single type of computation and often limited in their communication capabilities. Devices on this layer can include smart watches, traffic lights and environmental sensors. [2]
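To make the division of labor across these layers concrete, the following minimal Python sketch traces a reading from an edge device up to the cloud. The layer roles follow the Inovex description above, but every function name, threshold and data value is illustrative rather than taken from any particular product.

```python
# Illustrative data flow through the layered edge model described above.
# All names, thresholds and values here are hypothetical; real deployments
# would use protocols such as MQTT or OPC UA and real device SDKs.

from statistics import mean
from typing import List

def edge_device_read() -> float:
    """Edge device layer: a single-purpose sensor producing one raw reading."""
    return 21.7  # e.g., a temperature sample in degrees C

def edge_gateway_aggregate(samples: List[float]) -> dict:
    """Edge gateway layer: aggregate raw samples so only a summary travels upstream."""
    return {"count": len(samples), "avg": mean(samples), "max": max(samples)}

def edge_node_filter(summary: dict, alarm_threshold: float = 30.0) -> dict:
    """Edge node layer: add local analytics, e.g., flag alarms without waiting for the cloud."""
    summary["alarm"] = summary["max"] > alarm_threshold
    return summary

def cloud_store(summary: dict) -> None:
    """Cloud layer: long-term storage and fleet-wide analysis (stubbed here)."""
    print("Uploading summary to cloud:", summary)

if __name__ == "__main__":
    raw = [edge_device_read() for _ in range(100)]   # collected at the device layer
    cloud_store(edge_node_filter(edge_gateway_aggregate(raw)))
```

In practice the hand-offs between layers would run over network protocols rather than direct function calls, but the principle is the same: only summarized, actionable data travels toward the cloud.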
Today, edge computing is becoming essential where time-to-result must be minimized, such as in smart cars. Bandwidth costs and latency make crunching data near its source more efficient, especially in complex systems like smart and autonomous vehicles that generate terabytes of telemetry data. [3]
Besides vehicles, edge computing examples serving the IoT include smart factories and homes, smartphones, tablets, sensor-generated input, robotics, automated machines on manufacturing floors, and distributed analytics servers used for localized computing and analytics.
Major technologies served by edge computing include wireless sensor networks, cooperative distributed peer-to-peer ad hoc networking and processing (also classifiable as local cloud/fog computing), distributed data storage and retrieval, autonomic self-healing networks, remote cloud services, and augmented and virtual reality. [5]
Autonomous Vehicles and Smart Cars
New, so-called autonomous vehicles carry enough computing hardware that they could be considered mobile data centers. They generate terabytes of data every day. A single vehicle running 14 to 16 hours a day creates 1-5 TB of raw data an hour and can produce up to 50 TB a day. [6]
A moving self-driving car that continuously streams data to remote servers could meet disaster while waiting for central cloud servers to process that data and respond. Edge computing allows basic processing, such as deciding when to slow down or stop, to be done in the car itself, eliminating this dangerous latency.
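As a rough illustration of why that round trip matters, the sketch below compares how far a car travels before it can react when the decision is made on board versus in a distant data center. The latency figures are placeholder assumptions for illustration, not measured values.

```python
# Hypothetical comparison of decision latency: on-vehicle (edge) vs. cloud round trip.
# The latency values below are illustrative assumptions only.

LOCAL_INFERENCE_MS = 10       # assumed on-board processing time
CLOUD_ROUND_TRIP_MS = 150     # assumed network latency to a distant data center
CLOUD_PROCESSING_MS = 20      # assumed server-side processing time

def distance_before_reaction_m(speed_kmh: float, decision_latency_ms: float) -> float:
    """Distance traveled before braking even begins, given a decision latency."""
    speed_ms = speed_kmh / 3.6                     # km/h -> m/s
    return speed_ms * (decision_latency_ms / 1000)

if __name__ == "__main__":
    for label, latency in [("edge (in-vehicle)", LOCAL_INFERENCE_MS),
                           ("cloud", CLOUD_ROUND_TRIP_MS + CLOUD_PROCESSING_MS)]:
        d = distance_before_reaction_m(speed_kmh=100, decision_latency_ms=latency)
        print(f"{label}: car travels {d:.1f} m before reacting")
```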
Once an autonomous car is parked, nearby edge computing systems can provide added data for future trips. Processing this close to the source reduces the costs and delays associated with uploading to the cloud. Here, the processing does not occur in the vehicle itself.
Other Edge Computing Applications
Edge computing enables industrial and healthcare providers to bring visibility, control, and analytic insights to many parts of an infrastructure and its operations—from factory shop floors to hospital operating rooms, from offshore oil platforms to electricity production.
Machine learning (ML) benefits greatly from edge computing. The heavy-duty training of ML algorithms can be done in the cloud, and the trained model can then be deployed at the edge for near real-time or true real-time predictions.
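A minimal sketch of that split is shown below, assuming the cloud-trained model has been exported as a TensorFlow Lite file; the model file name and the input data are placeholders.

```python
# Minimal sketch: running a cloud-trained model at the edge for near real-time inference.
# Assumes the model was trained in the cloud and exported as a TensorFlow Lite file;
# "model.tflite" and the random input data are placeholders.

import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# One inference pass on locally collected sensor data (random data stands in here).
sample = np.random.random_sample(input_details[0]["shape"]).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
print("Edge prediction:", prediction)
```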
For manufacturing uses, edge computing devices can translate data from proprietary systems to the cloud. The capability of edge technology to perform analytics and optimization locally provides faster responses for more dynamic applications, such as adjusting line speeds and product accumulation to balance the line. [8]
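The sketch below illustrates the kind of local optimization loop described here, with a gateway reading accumulation from a line controller, adjusting line speed, and forwarding only a summary upstream. The register read, setpoint and gain are hypothetical values, not taken from any specific system.

```python
# Hypothetical gateway-side control loop: read accumulation from a proprietary line
# controller, adjust line speed locally, and forward only a summary to the cloud.
# The setpoint, gain and stubbed I/O functions are illustrative assumptions.

import json

ACCUMULATION_SETPOINT = 50.0   # target number of parts buffered between stations
GAIN = 0.2                     # proportional adjustment factor

def read_accumulation_from_plc() -> float:
    """Stand-in for a proprietary fieldbus read (e.g., via a vendor driver)."""
    return 63.0

def adjust_line_speed(current_speed: float, accumulation: float) -> float:
    """Simple proportional correction: slow down when parts pile up, speed up when starved."""
    error = ACCUMULATION_SETPOINT - accumulation
    return max(0.0, current_speed + GAIN * error)

def publish_to_cloud(payload: dict) -> None:
    """Stand-in for an MQTT/HTTPS upload of the summarized result."""
    print("cloud <-", json.dumps(payload))

if __name__ == "__main__":
    accumulation = read_accumulation_from_plc()
    new_speed = adjust_line_speed(current_speed=12.0, accumulation=accumulation)
    publish_to_cloud({"accumulation": accumulation, "line_speed": round(new_speed, 2)})
```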
Edge Computing Hardware
Processing power at the edge needs to be matched to the application and to the power available to run the edge system. If machine vision, machine learning and other AI technologies are deployed, significant processing power is necessary. If an application is more modest, such as digital signage, the required processing power can be lower.
Intel’s Xeon D-2100 processor is made to support edge computing. It is a lower power, system-on-chip version of a Xeon cloud/data center server processor. The D-2100 has a thermal design power (TDP) of 60-110 W. It runs the same instruction set as traditional Intel server chips, but takes that instruction set to the edge of the network. Typical edge applications for the Xeon D-2100 include multi-access edge computing (MEC), virtual reality/augmented reality, autonomous driving and wireless base stations. [10]
Thermal management of the D-2100 edge-focused processor is largely determined by the overall mechanical package the edge application takes. For example, if the application is a traditional 1U server with sufficient airflow into the package, a commercial off-the-shelf copper or aluminum heat sink should provide sufficient cooling. [11]
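A quick way to judge whether an off-the-shelf heat sink is adequate is to compute the maximum thermal resistance it may have for the processor's TDP. In the sketch below, only the 60-110 W TDP range comes from the Intel brief; the case temperature limit and ambient temperature are assumed values for illustration.

```python
# Back-of-the-envelope heat sink sizing for a Xeon D-2100 class part.
# Only the 60-110 W TDP range is from the product brief; the case temperature
# limit and ambient temperature below are illustrative assumptions.

T_CASE_MAX_C = 85.0    # assumed maximum allowable case temperature
T_AMBIENT_C = 45.0     # assumed inlet air temperature inside the enclosure

def required_sink_resistance(tdp_w: float) -> float:
    """Maximum case-to-ambient thermal resistance (C/W) the heat sink may have."""
    return (T_CASE_MAX_C - T_AMBIENT_C) / tdp_w

for tdp in (60, 110):
    print(f"TDP {tdp:>3} W -> heat sink must be <= {required_sink_resistance(tdp):.2f} C/W")
```

Under these assumed temperatures, the 110 W end of the range demands roughly half the thermal resistance of the 60 W end, which is why the mechanical package and available airflow dominate the cooling choice.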
An example of a more traditional package for edge computing is the ATOS system shown in Figure 6. But for less common packages, where airflow may be limited, more elaborate approaches may be needed. For example, heat pipes can transport excess processor heat to another part of the system for dissipation.
One design uses a vapor chamber integrated with a heat sink. Vapor chambers are effectively flat heat pipes with very high thermal conductance and are especially useful for heat spreading. In edge hardware applications where there is a small hot spot on a processor, a vapor chamber attached to a heat sink can be an effective solution to conduct the heat off the chip.
The Nvidia Jetson AGX Xavier is designed for edge computing applications such as logistics robots, factory systems, large industrial UAVs, and other autonomous machines that need high performance processing in an efficient package.
Nvidia has modularized the package, providing the needed supporting semiconductors and input/output ports. While it looks as if it could generate a lot of heat, the module only produces 30 W and has an embedded thermal transfer plate. However, any edge computing deployment of this module, where it is embedded into an application, can face excess heat issues. A lack of system airflow, solar loading, and heat from nearby devices can all negatively impact a module in an edge computing application.
Nvidia considers this in the development kit for this module, which has an integrated thermal management solution featuring a heat sink and heat pipes. Heat is transferred from the module’s embedded thermal transfer plate to the heat pipes and then to the heat sink.
For a given edge computing application, a thermal solution might use heat pipes attached to a metal chassis to dissipate heat, or it might combine a heat sink with an integrated vapor chamber. Studies by Glover et al. of Cisco found that for vapor chamber heat sinks, the thermal resistance varies from 0.19°C/W to 0.23°C/W at 30 W of power. [16]
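Those resistance figures translate directly into a temperature rise across the heat sink at a 30 W load, as the short calculation below shows; the 45°C ambient used to estimate the resulting case temperature is an assumption, not a value from the study.

```python
# Temperature rise implied by the vapor chamber heat sink resistances reported
# by Glover et al. (0.19-0.23 C/W at 30 W). The 45 C ambient is an assumed value.

POWER_W = 30.0
AMBIENT_C = 45.0  # assumed local ambient inside the edge enclosure

for r_sink in (0.19, 0.23):
    delta_t = r_sink * POWER_W              # temperature rise across the heat sink
    print(f"R = {r_sink:.2f} C/W -> rise {delta_t:.1f} C, "
          f"approx. case temperature {AMBIENT_C + delta_t:.1f} C")
```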
A prominent use case for edge computing is the smart factory empowered by the Industrial Internet of Things (IIoT). As discussed, cloud computing has drawbacks: latency, reliance on communication connections, and the time needed for data to travel to the cloud, be processed and return. Putting intelligence at the edge can solve many, if not all, of these potential issues. The Texas Instruments (TI) Sitara family of processors was purpose-built for these edge computing machine learning applications.
Smart factories apply machine learning in different ways. One is training, where machine learning algorithms use computational methods to learn information directly from a set of data. Another is deployment: once the algorithm has learned, it applies that knowledge to finding patterns or inferring results from other data sets. The results can be better decisions about how a process in a factory is running. TI’s Sitara family can execute a trained algorithm and make inferences from data sets at the network edge.
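The sketch below shows that train-then-deploy split in miniature, using scikit-learn and synthetic data as stand-ins; it is not TI's own tool chain, and the model, file name and data are illustrative only.

```python
# Sketch of the two phases described above: train in the cloud, then deploy the
# trained model for inference at the edge. scikit-learn and synthetic data are
# stand-ins; TI's tool chain for Sitara devices is not shown.

from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification
import joblib

# --- Cloud side: training on historical factory data (synthetic here) ---
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
joblib.dump(model, "line_fault_model.joblib")   # artifact shipped to the edge

# --- Edge side: load the trained model and infer on fresh sensor data ---
edge_model = joblib.load("line_fault_model.joblib")
new_reading = X[:1]                             # stands in for a live sensor vector
print("Predicted class:", edge_model.predict(new_reading)[0])
```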
The TI Sitara AM57x devices were built to perform machine learning in edge computing applications including industrial robots, computer vision and optical inspection, predictive maintenance (PdM), sound classification and recognition of sound patterns, and tracking, identifying, and counting people and objects. [18,19]
This level of machine learning processing may seem to require sophisticated thermal management, but the level of thermal management needed is really dictated by the use case. For hardware development, TI provides guidance by implementing a straight-fin heat sink with thermal adhesive tape on its TMDSIDK574 AM574x Industrial Development Kit board.
While not likely an economical production product, this kit provides a solid platform for developing many of the edge computing applications found in IIoT-powered smart factories. The straight-fin heat sink with thermal tape is a reasonable recommendation for this kind of application.
Most edge computing applications will not run on a lab bench or in a controlled prototype environment. They might involve hardware for machine vision (an application of computer vision). An example of a core board that might be used for this kind of application is the Phytec phyCORE-AM57x. [21]
Machine vision used in a harsh, extreme-temperature industrial environment might require not just solid thermal management but physical protection as well. Such a use case could call for thermal management built into a chassis. An example is the Arrow SAM Car chassis, developed both to cool and to protect the electronics used for controlling a car.
Another packaging example from the SAM Car is the chassis shown below, which is used in a harsh IoT environment. This aluminum enclosure has cutouts and pockets connecting to the chips on the internal PCB. The chassis acts as the heat sink and provides significant protection in harsh industrial environments.
Edge computing cabinetry is small in scale (e.g., fewer than 10 racks) but can pack substantial computing capability. It can be placed in nearly any environment and location to provide power, efficiency and reliability without the need for the support structure of a larger white-space data center.
Still, racks used in edge cabinets can house high levels of processing power, so the enclosure and/or certain components need a built-in, high-performance cooling system.
Hardware OEMs like Rittal build redundancy into edge systems. This lets other IT assets remain fully functional and operational, even if one device fails. Eliminating line downtime, preserving key data and responding rapidly all contribute to a healthier bottom line.
Although edge computing involves fewer racks, the hardware and its data still need reliable cooling. For edge computers in remote locations, the availability of cooling resources may vary. Rittal provides both water- and refrigerant-based options. Refrigerant cooling allows flexible installation, while water-based cooling brings the advantage of ambient-air assist for free cooling. [25]
LiquidCool’s technology collects server waste heat in a fluid system and transports it to an inexpensive remote outdoor heat exchanger. Alternatively, the waste heat can be repurposed; in one IT-closet-based edge system, the fluid-transported waste heat is used to heat an adjacent room. [26]
Green Revolution Cooling provides ICEtank turnkey data centers built inside ISO shipping containers for edge installations nearly anywhere. The ICEtank containers feature immersion cooling systems. Their ElectroSafe coolant protects against corrosion, and the system removes any need for chillers, CRACs (computer room ACs) and other powered cooling systems. [27]
A Summary Chart of Suggested Cooling for Edge Computing
The following chart summarizes air cooling options for edge computing applications:
The Leading Edge
The edge computing marketplace is currently experiencing a period of unprecedented growth. Edge market revenues are predicted to reach $6.72 billion by 2022, supporting a global IoT market expected to top $724 billion by 2023. The accumulation of IoT data, and the need to process it at local collection points, will continue to drive the deployment of edge computing. [28,29]
As more businesses and industries shift from enterprise to edge computing, they are bringing the IT network closer to speed up data communications. There are several benefits, including reduced data latency, increased real-time analysis, and resulting efficiencies in operations and data management. Much critical data also stays local, reducing security risks.
References
- https://www.networkworld.com/article/3224893/what-is-edge-computing-and-how-it-s-changing-the-network.html
- https://www.inovex.de/blog/edge-computing-introduction/
- https://www.datacenterknowledge.com/edge-computing/searching-car-network-s-natural-edge
- https://www.bloomberg.com/news/articles/2019-06-17/ai-needs-edge-computing-to-make-everyday-devices-smarter
- https://www.networkcomputing.com/networking/how-edge-computing-compares-cloud-computing
- https://medium.com/velotio-perspectives/a-beginners-guide-to-edge-computing-6cfea853aa11
- https://www.datacenterknowledge.com/edge-computing/searching-car-network-s-natural-edge
- https://www.wespeakiot.com/will-edge-computing-devour-cloud/
- https://www.designnews.com/automation-motion-control/edge-computing-emerges-megatrend-automation/27888481159634
- https://www.design-reuse.com/news/45423/xilinx-baidu-brain-edge-ai-edgeboard.html
- https://www.intel.com/content/www/us/en/products/docs/processors/xeon/d-2100-brief.html
- https://software.intel.com/en-us/articles/intel-xeon-processor-d-2100-product-family-technical-overview
- https://atos.net/en/2019/press-release/general-press-releases_2019_05_16/atos-launches-the-worlds-highest-performing-edge-computing-server
- https://venturebeat.com/2012/09/11/this-coke-machine-has-an-intel-core-i7-processor-and-it-can-take-your-picture/
- https://www.custompcreview.com/news/nvidia-announces-jetson-x2-edge-computing-platform/
- https://developer.nvidia.com/embedded/jetson-agx-xavier-developer-kit#resources
- Glover, G., Chen, Y., Luo, A., and Chu, H., “Thin Vapor Chamber Heat Sink and Embedded Heat Pipe Heat Sink Performance Evaluations”, 25th IEEE Symposium, San Jose, CA, USA, 2009.
- http://www.ti.com/tool/SITARA-MACHINE-LEARNING#descriptionArea
- https://www.mathworks.com/discovery/machine-learning.html
- http://www.ti.com/tool/SITARA-MACHINE-LEARNING#descriptionArea
- http://www.ti.com/tool/TMDSIDK574
- https://www.phytec.com/phytec-announces-a-new-system-on-module-som-based-on-the-new-sitara-am57x-processor-family-from-texas-instruments/
- http://processors.wiki.ti.com/index.php/File:PhyCORE-AM57x_SOM.jpg
- https://www.qats.com/cms/2017/10/09/ats-collaborates-sam-car-featured-cnbc-program-jay-lenos-garage/
- https://www.custompcreview.com/news/nvidia-announces-jetson-x2-edge-computing-platform/
- https://www.rittal.us/contents/edge-computing-and-uncontrolled-environments/
- https://www.liquidcoolsolutions.com/edge-server/#single/null
- https://www.grcooling.com/edge-computing/
- https://blog.apc.com/2019/05/15/four-reasons-configure-to-order-rack-pdus-edge-computing-environments/
- https://www.techrepublic.com/article/edge-computing-the-smart-persons-guide/