Edge Computing Today – What to Know

Cloud Infrastructure Architect and Tech Guru, Jim Demetrius, shares the latest on edge computing and what you need to know.

Over the years, we have seen many different types of computing evolve and shape our daily lives and industries. When computers were first developed, they occupied entire buildings, consumed large amounts of electricity and produced great amounts of heat. The first electronic computers, built in the 1940s, were massive vacuum-tube machines weighing roughly 30 tons and occupying about 1,800 square feet. Today we hold smartphones in our hands that can process millions of transactions, which is remarkable when you consider the computers used to send men to the Moon. Today's smartphones are more powerful than the supercomputers of the 1980s.

Why is this important? When you consider how much compute power a smartphone has, and how close that power sits to the user, it points directly to the future of edge computing.

The Future of Edge Computing

Edge computing is the next evolution of computing, allowing businesses of today to take advantage of compute power closer to the actual workload. Instead of having to send the data off for processing at a remote datacenter and then having to wait for the subset of necessary data to return, now we can deploy compute resources closer to the edge where it is needed. This is useful across multiple verticals and industries, including healthcare, manufacturing, automotive and entertainment.

Take a manufacturing plant as an example: with edge computing, data from the machinery can now be processed on site, or very close to the manufacturing site, allowing subsequent steps to occur more quickly. Edge processing can eliminate bottlenecks that once cost hours or even days.
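As a rough illustration of the pattern described above, here is a minimal sketch of on-site decision-making: an edge node on the factory floor classifies a part's inspection measurement locally, so the line can act immediately instead of waiting on a round trip to a remote datacenter. The function names, target size and tolerances are all hypothetical, not from any specific product.

```python
# Hypothetical sketch: an edge node classifies a machined part's measured
# size locally and returns the next action for the production line.

def classify_reading(measurement_mm: float, target_mm: float = 25.0,
                     tolerance_mm: float = 0.5) -> str:
    """Decide the next step for a part based on its measured size."""
    deviation = abs(measurement_mm - target_mm)
    if deviation <= tolerance_mm:
        return "pass"      # within spec: continue down the line
    if deviation <= 2 * tolerance_mm:
        return "rework"    # marginal: route to the rework station
    return "reject"        # out of spec: pull from the line

def route_part(measurement_mm: float) -> str:
    # The decision is made on site; only summary records would be sent
    # upstream to a central datacenter later, in batches.
    return classify_reading(measurement_mm)
```

Because the decision logic runs next to the machinery, the conveyor gets its answer in microseconds rather than after a network round trip, which is the latency win the article describes.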

There are many providers in the edge compute arena, from the familiar network providers to private cloud and colocation providers. So, let's explore where edge compute is and how it gets deployed to determine the different options. 

Edge compute can be found in many different configurations, from dedicated to shared infrastructure, with providers ranging from large wireless carriers to datacenter providers to onsite processing. The requirements for supporting edge compute devices are connectivity, power and a suitable environment for the equipment (temperature control, space and cleanliness).

Starting with the larger wireless providers: they operate many offices, POPs (small to large outbuildings) and towers capable of housing compute infrastructure, allowing them to be closer to end users and applications. Datacenter providers have similar offerings, reaching many different geographic areas with locations for edge compute infrastructure. Finally, the facility or office building that needs the data processing can itself serve as an edge location. One example is a warehouse that needs to process a large amount of metadata in order to move packages to the next step in their journey; an edge compute stack could be located at the processing facility itself. Another example where edge compute will play a large role is autonomous vehicles and the enormous amounts of data they create and need to process.

Why Process Closer to the Edge?

Deploying edge compute stacks closer to where the work is performed has the obvious benefit of lower latency for applications and data processing. It allows businesses to process data more quickly, for tasks like price quoting or accurately managing goods and services, creating a better experience for end customers. Another benefit of edge computing is the ability to reach businesses previously isolated by their geographic distance from where data is computed. With the innovation of edge compute and connectivity options like 5G, reaching remote areas is becoming much more achievable. Revisiting the example of autonomous vehicles: they are constantly moving, and having edge locations for those vehicles to communicate with is crucial to their successful adoption.

It goes without saying that latency for applications and databases can be greatly reduced by placing compute closer to them. Edge compute will require a redesign of some applications, as they will need to communicate differently while still meeting all underlying security requirements. The possibilities for IoT sensors and edge compute are endless, as everything in our world is becoming a connected device. Our cars, refrigerators, washers and dryers, and heating and AC systems all have sensors that tell us when to service them, or even let us control them remotely from a phone or tablet.
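One common way edge compute and IoT sensors work together, consistent with the examples above, is local filtering: an edge gateway examines raw readings on site and forwards only the unusual ones to the cloud, cutting both latency and bandwidth. The following is a minimal sketch under that assumption; the sensor names and temperature thresholds are illustrative only.

```python
# Hypothetical sketch: an edge gateway keeps normal IoT readings local
# and forwards only out-of-range values (e.g. a refrigerator running
# too warm) to the cloud for alerting.

from typing import Dict, List

def filter_for_upload(readings: List[Dict], low: float = 2.0,
                      high: float = 8.0) -> List[Dict]:
    """Return only readings outside the normal band; the rest stay local."""
    return [r for r in readings if not (low <= r["value"] <= high)]

readings = [
    {"sensor": "fridge-temp", "value": 4.1},  # normal, stays on the edge
    {"sensor": "fridge-temp", "value": 9.7},  # too warm, send to cloud
    {"sensor": "fridge-temp", "value": 3.8},  # normal, stays on the edge
]
alerts = filter_for_upload(readings)
```

Here only the 9.7-degree reading leaves the gateway, which is why a house full of connected appliances does not need to stream every measurement to a remote datacenter.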

Edge compute is not just for the enterprise customer; it will affect end consumers both directly and indirectly, ultimately making our lives easier by reducing, if not eliminating, tasks that require a great deal of interaction from us today. We are already seeing some of the benefits of IoT in the service reminders from our cars and appliances. Edge compute is just beginning to catch on and will continue to play a large part in our future, for enterprises and consumers alike. Stay tuned for more updates on the progress being made in edge compute, IoT and 5G enablement.

If you would like to discuss any cloud infrastructure matters or direct opportunities, you can click here to book time with Jim or submit a Tech Gurus support request via TBI’s OnDemand platform found here. Also, be sure to connect with Jim on LinkedIn.


An accomplished AWS Certified Solutions Architect, Jim Demetrius brings over 30 years of experience helping organizations implement highly secure, cost-effective solutions that can grow with their business. Working with many notable Fortune 100 companies, he has been involved in all aspects of the IT business lifecycle, garnering in-depth knowledge of complex managed hosting, cloud, colocation and hybrid computing.