How Edge Computing Will Drive Power Innovations
Whilst many people are still coming to terms with what ‘Cloud’ computing means, along comes ‘Edge’ computing. The two are in fact inherently related.
Few realise that Amazon, through AWS (Amazon Web Services), is one of the largest cloud-based datacentre service providers; Microsoft and Google are others. What these companies offer is a network of datacentres housing thousands of servers providing a cloud-based service. Whether you are a business, an organisation or an individual, you can open an account and store, process, back up and archive your data. You purchase a service from the provider, and it is up to that provider how it organises its server capacity to deliver it, building in redundancy and resilience as part of the service terms.
Some of these cloud services have become everyday apps, including Dropbox, Gmail, Office 365 and Slack. Streaming services from Amazon, Google, Apple and Netflix, and the catch-up players from the BBC and ITV, are all users of the cloud, whether they own the infrastructure or subscribe to a service.
Edge computing is the latest development in the datacentre world. So what exactly is the Edge, and what does it mean for organisations running their own server rooms? Edge computing pushes centralised cloud-based services closer to the user, distributing data services nearer to where data is generated and used.
Why? One of the reasons is to reduce latency, the time it takes to access information. Speed is a strategic differentiator in most business environments, and website click-throughs are often determined by loading speed and the time it takes to access information or buy a product or service online. However, one must also think wider than this and consider how technology will evolve in the years to come.
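As a rough illustration of the latency argument, the sketch below estimates one-way propagation delay over optical fibre for a distant cloud region versus a nearby edge site. The distances and the two-thirds-of-light-speed figure are illustrative assumptions; real-world latency also includes routing, queuing and server processing time.

# Rough, illustrative estimate of fibre propagation delay only.
# Distances and the speed-in-fibre factor are assumptions; real
# latency also includes routing, queuing and server processing.

SPEED_OF_LIGHT_KM_S = 300_000          # speed of light in vacuum, km/s
FIBRE_FACTOR = 2 / 3                   # light travels roughly 2/3 c in fibre

def one_way_delay_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds over optical fibre."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR) * 1000

for label, km in [("Distant cloud region (~2,000 km)", 2000),
                  ("Regional datacentre (~300 km)", 300),
                  ("Edge site (~20 km)", 20)]:
    print(f"{label}: ~{one_way_delay_ms(km):.2f} ms one-way")

Even on these simplified assumptions, moving the service from a distant region to a local edge site cuts propagation delay from around 10 ms to a fraction of a millisecond each way, before any of the other delays are considered.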
We are on the verge of over 20 billion devices being connected worldwide to the Internet of Things (IoT) within the next five years. This is not just computers, mobile phones, tablets and laptops, but also assistants like Amazon Alexa, Google Home and Apple HomePod, intelligent home systems (lighting, thermostats, fridges, TVs, cookers and kettles), Industry 4.0 applications, self-driving cars and lorries, and delivery drones.
Edge computing provides a way to solve latency issues and deliver the scale of processing needed for IoT applications and the new types of services that will evolve from greater connectivity.
So, having summarised what the ‘Edge’ is, how can it be delivered from a hardware perspective? We forecast continued development of on-site server rooms and controlled IT environments. These may be built as individual rooms or deployed as self-contained micro-datacentres. The latter may be more practical and economical to deploy where large amounts of computing power are needed in a tightly controlled environment, possibly where space is limited or at a premium.
Edge facilities and micro-datacentres will still need the type of critical systems we see within a server room or datacentre environment, including uninterruptible power supplies, power distribution, cooling, access control, fire suppression and environmental monitoring. The point about a micro-datacentre is that it is more than a containerised building: it is a self-contained room or system that can be built up on-site and deployed rapidly to meet growth and demand.
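At its simplest, environmental monitoring in such a facility comes down to comparing sensor readings against alarm thresholds. The sketch below shows only that general idea; the sensor names and threshold values are illustrative assumptions, not figures from any particular monitoring product.

# Minimal sketch of threshold-based environmental monitoring for a
# micro-datacentre. Sensor names and thresholds are illustrative only.

THRESHOLDS = {
    "temperature_c": (18.0, 27.0),   # assumed acceptable inlet temperature range
    "humidity_pct": (20.0, 80.0),    # assumed relative humidity range
}

def check_reading(sensor: str, value: float) -> str:
    low, high = THRESHOLDS[sensor]
    if value < low:
        return f"ALARM: {sensor} low ({value})"
    if value > high:
        return f"ALARM: {sensor} high ({value})"
    return f"OK: {sensor} = {value}"

print(check_reading("temperature_c", 31.5))  # ALARM: temperature_c high (31.5)
print(check_reading("humidity_pct", 45.0))   # OK: humidity_pct = 45.0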
One aspect that aids quick deployment is standardisation, and Edge computing will lead to new developments and standards. One area to consider is the role of uninterruptible power supplies and their battery packs. A UPS system has traditionally used lead acid batteries, which are well suited to a standby power role. When mains power is present, the UPS batteries are charged up and the charge maintained. The batteries are only used when the mains power supply fluctuates or fails completely: they discharge into the UPS inverter, which powers the load until either a local standby power generator starts up or the mains supply returns. Worst case, there is no back-up generating set and/or the batteries are exhausted before mains power is restored.
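The standby behaviour described above can be pictured as a simple state machine: charge and float the batteries while mains is healthy, switch the load to the inverter when mains fails, and hand over to a generator or restored mains before the batteries are exhausted. The sketch below is a simplified model of that sequence, not a real controller implementation.

# Simplified model of the standby UPS behaviour described above.
# States and transitions are illustrative; a real UPS controller is
# considerably more involved (transfer switches, bypass, alarms, etc.).

def ups_state(mains_ok: bool, generator_running: bool, battery_charge_pct: float) -> str:
    if mains_ok:
        # Mains present: load runs from mains, batteries are charged/float-charged.
        return "ON_MAINS_CHARGING" if battery_charge_pct < 100 else "ON_MAINS_FLOAT"
    if generator_running:
        # Standby generator has started and picked up the load.
        return "ON_GENERATOR"
    if battery_charge_pct > 0:
        # Mains failed: batteries discharge into the inverter to carry the load.
        return "ON_BATTERY"
    # Worst case: no generator and batteries exhausted before mains returns.
    return "LOAD_DROPPED"

print(ups_state(mains_ok=False, generator_running=False, battery_charge_pct=40))  # ON_BATTERY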
Some UPS systems can now be installed with lithium-ion battery packs. There are several types of Li-ion battery (visit: https://en.wikipedia.org/wiki/Lithium-ion_battery), but the operational characteristics of the type used with UPS systems are very similar to those found in mobile phones and tablets. These batteries are designed for continuous cyclic usage in terms of charge/discharge cycles and can recharge more rapidly than lead acid batteries. Whilst they are more expensive, their working life can be double that of a lead acid battery, achieving a lower Total Cost of Ownership (TCO).
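A simple way to see the TCO argument is to compare purchase and replacement costs over the service life of the UPS. The figures below are purely illustrative assumptions (relative prices, battery service lives and the planning horizon), not quoted battery prices, and they ignore maintenance, cooling and disposal costs.

# Illustrative-only TCO comparison between lead acid and Li-ion UPS
# batteries. All figures are assumptions chosen to show the method,
# not quoted prices or real service lives.

def battery_tco(purchase_cost: float, service_life_years: int, horizon_years: int) -> float:
    """Total spend on the battery sets needed to cover the planning horizon."""
    sets_required = -(-horizon_years // service_life_years)  # ceiling division
    return purchase_cost * sets_required

HORIZON = 10  # assumed UPS service life in years

lead_acid = battery_tco(purchase_cost=1.0, service_life_years=5, horizon_years=HORIZON)
li_ion = battery_tco(purchase_cost=1.7, service_life_years=10, horizon_years=HORIZON)

print(f"Lead acid (relative cost): {lead_acid:.1f}")  # two sets needed -> 2.0
print(f"Li-ion (relative cost): {li_ion:.1f}")        # one set needed  -> 1.7

On these assumed figures, the lead acid bank must be replaced once within the UPS lifetime, so the dearer Li-ion pack still works out cheaper over the full horizon.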
The energy storage capacity of lithium-ion batteries will continue to evolve quickly as more applications adopt this type of battery, including electric vehicles as well as UPS systems. This gives rise to other potential applications, including demand side response programmes that pay UPS and EV users to allow their stored energy to be used to supplement grid power levels and availability.
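In a demand side response scheme the basic decision is: if the grid signals stress (for example a frequency dip) and the battery holds more energy than the UPS needs to protect its own load, offer the surplus back. The sketch below captures only that decision; the frequency threshold, reserve level and export figures are assumptions for illustration, not values from any actual scheme.

# Minimal sketch of a demand side response decision for a UPS or EV
# battery. Threshold, reserve and export values are assumptions.

NOMINAL_FREQ_HZ = 50.0      # UK grid nominal frequency
TRIGGER_FREQ_HZ = 49.8      # assumed trigger for a response event
RESERVE_PCT = 60.0          # charge the UPS must keep back for its own load

def respond(grid_freq_hz: float, battery_charge_pct: float) -> float:
    """Return the share of charge (in percentage points) offered to the grid."""
    if grid_freq_hz >= TRIGGER_FREQ_HZ:
        return 0.0                       # grid healthy: keep charging or floating
    surplus = battery_charge_pct - RESERVE_PCT
    return max(surplus, 0.0)             # only export charge above the reserve

print(respond(grid_freq_hz=49.7, battery_charge_pct=85.0))  # 25.0 points offered
print(respond(grid_freq_hz=50.0, battery_charge_pct=85.0))  # 0.0, no event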
Widespread use of Edge computing and micro-datacentres also provides greater opportunities for the use of artificial intelligence (AI) within IT networks. Examples include automating demand side response programmes at the speed required to maintain uptime, and synchronising multiple storage resources between buildings and even substations. Whilst power demand will continue to increase rapidly, Edge technologies and the Internet of Things provide opportunities to maintain uptime and improve response speeds. These environments and applications require the same level of design and build consultancy as server rooms or datacentres if they are to be resilient, energy efficient and achieve the best possible TCO for their owners and operators.