The Edge Case: Redux

For years now, businesses have been migrating to full cloud solutions, or at the very least adopting hybrid systems. The continual build-out of data centers by the major tech companies, coupled with low-power chips from the likes of ARM and TSMC, has created efficiencies in on-site power, computation, and rendering back to a customer's or client's PC. Here, I use "PC" to mean anything from a desktop, laptop, tablet, or phone to any other similarly purposed device. The cloud and cloud migration have matured, and at the very least the capabilities to begin utilizing their full power are upon us.

Forrester predicts the public cloud market will see its growth rate decline from 42% in 2018 to 24% in 2022 due to market maturation. In its place will be an explosion of growth in edge computing, meaning more growth for companies that have invested in cloud-like solutions for edge computing and content delivery rather than centralized data centers.

Source: TechRepublic

At this point in time, Gartner states that only 10 percent of enterprise data is created and processed outside traditional centralized data centers or the cloud, but by 2025, that number will grow to 75 percent.

Early Edge Computing

In an ironic, multi-decade turn of events, the model behind edge computing is how the history of computing began, before the advent and adoption of the personal computer. Mainframes and ARPANET were once accessible only to the U.S. government and research universities. As needs evolved, companies like Intel mass-produced the microprocessor and Microsoft called for a "PC on every desk." Now these trends seem to be reverting to the old-fashioned way of conducting business.

Fast-forwarding to the 90s, Akamai, an "Internet backbone" company as I like to refer to them, developed methods to distribute Internet traffic across various servers, allowing for distributed computing. Modern data centers then began to crop up, now managed by the likes of Amazon, Microsoft, Google, Oracle, and countless others.

The client-server relationship continues to this day; however, given the vast amounts of data, the proliferation of IoT and always-connected devices, and security concerns, central controls are only going to grow in popularity as business demands it.

Current Edge Computing

Traditional managed software is slowly being deprecated. An instance of Windows 10, for example, will not need to be downloaded to a lower-power PC; the computation and rendering will take place in the cloud and be streamed back to these low-power-consumption devices. Chips from the likes of Nvidia, Qualcomm, and Apple (with its ARM designs) will continue to ride Moore's Law. Performance per watt will climb parabolically, and the cost of what I'll call "consumption" devices will become extremely inexpensive, to the benefit of business purchasers.

Examples already exist today, such as streaming video and music on phones, tablets, and smart TVs. A Chromecast Ultra, for example, can also be used to stream Google's Stadia gaming platform. All the rendering of a purchased game is done in Google's cloud, and early reviews of the latency, ping, and performance have been positive so far. As Windows Central states, Stadia games are on the precipice of being playable anywhere a "client" device is located: a TV, a browser, a media stick, a phone, and so on.

As I mentioned earlier, security is probably a major reason why businesses and consumers alike are slowly adopting edge computing. On the business level, permissions can be set for all employees with a click, and as WEI states, the complex issue of data sovereignty is made much simpler. Considering the SolarWinds hack, which infiltrated a number of U.S. government sites and the servers of corporate America, the need for security, mitigation techniques, and training will only climb from here; at this point, the SolarWinds hack may be the largest in recorded history.

The Future of Edge Computing

This trend toward distributed computing is becoming more necessary as more of our everyday gadgets are connected to the Internet. As more bandwidth comes online globally through initiatives such as Elon Musk's Starlink and Facebook's undersea cable build-outs, the network is getting more crowded, but at the same time smarter and more efficient, thanks to artificial intelligence and machine learning. To be "intelligent," so to speak, about bandwidth and throughput, these IoT chips have to be disintermediated from central servers, yet still able to communicate with other devices on the same network.
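To make that "disintermediated, yet still communicating" idea concrete, here is a minimal, hypothetical sketch of a sensor node announcing readings to peers on its local network without any round trip to a distant data center. The addresses, port, and payload fields, and the use of raw UDP broadcast, are all illustrative assumptions on my part; a real deployment would more likely use a protocol such as MQTT or CoAP through a local edge gateway.

```python
# Hypothetical sketch: an IoT sensor sharing readings with peers on the same
# local network via UDP broadcast, with no central cloud in the loop.
# Addresses, port, and payload fields are illustrative assumptions only.
import json
import socket
import time

BROADCAST_ADDR = ("255.255.255.255", 50505)  # assumed LAN broadcast address and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)  # allow broadcast sends

for _ in range(5):
    reading = {"sensor": "temp-01", "celsius": 21.7, "ts": time.time()}  # stand-in value
    # Peers on the same subnet (other sensors, a local gateway) can pick this up
    # directly; no round trip to a centralized data center is required.
    sock.sendto(json.dumps(reading).encode("utf-8"), BROADCAST_ADDR)
    time.sleep(1)

sock.close()
```

A listening peer would simply bind a UDP socket to the same port and decode the JSON payloads; the point is that coordination stays on the local network, with the cloud reserved for aggregation and heavier analysis.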

Neural networks are among the next trends to watch at the edge. They are the foundation of deep learning, giving AI systems the tools to help solve their own problems or, for example, compensate for their own errors and shortcomings. A self-driving vehicle would be extraordinarily expensive to purchase, and impractical, if its own server were on board. A siloed server also cannot communicate with other vehicles on the road, or with the road itself; they must all be interoperable and always communicating.

These vehicles of the future must be connected to the cloud, whether through 5G, a service such as Starlink, or portable container data centers distributed like today's cell towers, in order to take advantage of redundancy and allow each vehicle's neural network to learn from, and compensate for, its surroundings.

This trend is not an end-all, be-all. It will always be evolving toward lower latency, faster connections, smarter chips, and, as always, the lowest power consumption possible:

However, in the era of edge computing, deploying deep neural networks on mobile edge platforms are challenging due to long latency and huge computational cost. As previous research efforts were usually focused on accuracy, achieving the balance between computational consumption and accuracy is a more significant problem to be tackled in mobile edge computing domain. 

Source: Journal of Systems Architecture (97) 2019
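One common way to strike that balance between computational cost and accuracy is to shrink a trained network before shipping it to an edge device. The sketch below is a hedged illustration of that general idea, not the method of the cited paper: it uses PyTorch's dynamic quantization to store the weights of a toy model's linear layers as 8-bit integers, then compares serialized sizes. The architecture and dimensions are stand-ins I chose for the example.

```python
# Hypothetical sketch: shrinking a small network for edge deployment using
# PyTorch dynamic quantization (Linear weights stored as int8 instead of float32).
# The toy architecture below is an illustrative assumption, not a model from
# the cited paper.
import os
import torch
import torch.nn as nn

# A small fully connected classifier, e.g. for simple sensor or telemetry data.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# Quantize the Linear layers dynamically: weights become int8, which cuts memory
# and often speeds up CPU inference on edge hardware, usually at a small accuracy cost.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def size_on_disk(m: nn.Module) -> int:
    """Serialize the model's weights and return the file size in bytes."""
    torch.save(m.state_dict(), "tmp.pt")
    size = os.path.getsize("tmp.pt")
    os.remove("tmp.pt")
    return size

print(f"float32 model: {size_on_disk(model):,} bytes")
print(f"int8 model:    {size_on_disk(quantized):,} bytes")

# Inference works the same way on the quantized copy.
x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])
```

The same tradeoff shows up in other forms, such as pruning or distilling a larger model into a smaller one, but the goal is the same: keep accuracy acceptable while fitting the compute and power budget of the device at the edge.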

Edge computing is a trend that must be considered by all businesses, governments, and consumers. Learning about its benefits is paramount to entering the next stage of computing with financial, security, and economic longevity.